The future of archiving

We all know content is king, but just as important is the need to prepare, protect and preserve that content through viable, long-lasting solutions.

Within the video production industry alone, data-heavy content such as HD, 4K and high-frame-rate video is growing exponentially by the day. These newly created assets need to be managed effectively, stored safely, and utilised alongside older assets.

Broadcasters, production companies, and other content holders are not only handling large and growing quantities of daily content, but are also concerned with digitising the massive backlog of VTR assets currently sitting on shelves. As it becomes increasingly possible to manage and rapidly search these materials via shared networks, new potential is discovered for the reuse of such assets.

As data volumes rise, so do storage costs – making it essential to implement storage systems that distinguish between hot (frequently accessed), warm (occasionally accessed), and cold (infrequently accessed) data, and to select the best storage medium for each. The main challenge for content owners is to preserve, access, and re-use their valuable assets without incurring repeated investment and huge running costs.
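As a rough illustration of the hot/warm/cold tiering logic described above, the sketch below classifies an asset by how recently it was accessed. The thresholds and the `storage_tier` helper are hypothetical, for illustration only; real tiering policies depend on workload and business rules.

```python
from datetime import datetime, timedelta

# Hypothetical access-age thresholds (illustrative, not industry-standard).
HOT_WINDOW = timedelta(days=30)     # accessed within the last month
WARM_WINDOW = timedelta(days=365)   # accessed within the last year

def storage_tier(last_accessed: datetime, now: datetime) -> str:
    """Classify an asset as hot, warm, or cold by last-access age."""
    age = now - last_accessed
    if age <= HOT_WINDOW:
        return "hot"    # keep on fast disk/SSD
    if age <= WARM_WINDOW:
        return "warm"   # near-line storage, e.g. an optical library
    return "cold"       # deep archive

now = datetime(2017, 6, 1)
print(storage_tier(datetime(2017, 5, 20), now))  # hot
print(storage_tier(datetime(2016, 9, 1), now))   # warm
print(storage_tier(datetime(2010, 1, 1), now))   # cold
```

In practice such a rule would run periodically against an asset-management database, demoting assets to cheaper tiers as they age out of each window.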

Long-term reliability, the ability to maintain large quantities of data at relatively low cost, and the ability to maintain data integrity in “green environments” with limited environmental controls are all essential requirements.

Sony is convinced that optical disc storage fills all of these requirements, and is therefore ideal for warm and cold storage. The new technology, with open and non-proprietary formats, involves the use of multiple bare discs contained within a robust cartridge and a dedicated disc drive unit with an associated software driver able to manipulate the discs individually – providing a seamless read/write capability.

The non-contact read/write technology offers remarkably fast access to data compared with tape, which must be physically fast-forwarded or rewound – along 800 metres or more of tape – until the location of the required data is reached. Optical discs are also never going to jam, tangle or snap.

The fact remains that optical discs are considerably more durable than hard-disk storage systems or magnetic tape media, with a 100-year shelf-life expectancy. The system is highly reliable and optimised for long-term archiving. It also keeps total archiving costs down and has a low environmental footprint. It offers accessibility and high speed, and can be scaled to fit users’ needs – beginning with a small archive stored on a few shelves and expanding into a large library as data accumulates.

Whilst the professional AV media industry has moved steadily from its tape based origins toward file based workflows for acquisition, post-production and distribution, the archive domain continues to remain largely tape-based. An alternative modern day solution, the Optical Disc Archive (ODA), has been created helping organisations achieve safe, long-term storage of video, photos, text, and other important digital assets.

Both LTO magnetic tape and Optical Disc Archive are viable cold storage options. The most common complaint from the user community is the constant need to migrate valuable assets from one generation of tape media to the next simply to maintain a viable archive. This requirement for copy migration every two generations (approximately five to six years) incurs substantial media and labour costs.
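The migration cadence described above translates into simple arithmetic: at one copy migration every five to six years, a 30-year retention period implies four to five full migrations of the entire archive. A minimal sketch (the `migrations_needed` helper is illustrative, not from any vendor tool):

```python
import math

def migrations_needed(retention_years: float, cadence_years: float) -> int:
    """Number of full copy migrations over the retention period,
    assuming one migration per cadence interval (excluding the
    initial write)."""
    return math.ceil(retention_years / cadence_years) - 1

# Copy migration every ~5-6 years over a 30-year retention:
print(migrations_needed(30, 5))  # 5
print(migrations_needed(30, 6))  # 4
```

Each of those migrations multiplies media, hardware and labour costs, which is the total-cost-of-ownership argument made below.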

By leveraging proven optical technologies and the inter-generational compatibility of optical discs, ODA technology can store important data safely, eliminating the need for migration every few years. This removes the need for media, hardware and software re-investment, as well as the cost of the human resources required to perform copying work, resulting in a reduced total cost of ownership.

ODA solutions are also ideal for deep archive, where data tape does not provide the assurance needed for very long-term archive requirements. ODA also provides a second-copy broadcast archive solution at a remote site and is suitable for business continuity, disaster recovery, post-house and production back-up, and for video, film and stock footage archives or AV national archives. The system can also be used for news and sports clips that need to be near-line, and as an online browse and proxy clip store.

Recently, Sony unveiled the second generation of its Optical Disc Archive System, which doubles the capacity of a single cartridge, doubles read/write speeds over the previous generation, accommodates 4K video in real time, and maintains backwards read compatibility with first-generation optical disc drives.

Optical disc archive can serve as the core of highly productive archive systems capable of managing and storing valuable, high-volume data—including 4K video, future-generation video, older video assets, and multimedia video content.

The technology is future-proof and represents a significant leap forward in data storage, suitable for a wide range of circumstances.

Now and in the future, this system delivers an efficient, secure, and reliable archive solution. Unlike data tape technology, where content must be migrated as technology moves on or become inaccessible, Optical Disc Archive media written today will be readable by the drives of tomorrow.

The bottom line is that, with the region still predominantly tape-based, it is now time to transition towards the future. Optical disc archiving is the way forward – a solution that is long-term, economical, and ultimately scalable enough to grow with your business.

Written by Nabil El Madbak


Source: ScreenAfrica

From floppy disks to deep freeze: what’s the best way to store data?

A New York-based team of volunteer archivists and preservationists are working to transfer old VHS videotapes into digital formats. Volunteers meet weekly in a Tribeca loft filled with “racks of tape decks, oscilloscopes, vectorscopes and waveform monitors” to painstakingly digitize cassettes from the 1980s and 1990s. As they note, transferring video isn’t plug-and-go; much tweaking and troubleshooting can be required to get it right. That’s why they’ve only managed to transfer 155 tapes so far – a very small percentage of the total analog format archive.

The group partners with artists, activists, and individuals to lower the barriers to preserving at-risk audiovisual media – especially unseen, unheard, or archived works.

Whatever the content, once it’s digitized, it becomes publicly available via the Internet Archive.

And what about your own tapes? There are plenty of paid services that will help you digitize old videotapes – or you can do it yourself using directions from open sources. And if you still have a big dusty box of home video tapes stored somewhere deep in the closet, it may be a good idea to transfer their contents to a new storage medium. In fact, we have already discussed this in one of our previous articles.

Tape manufacturers predicted a life expectancy of 20 to 30 years, but media lifespan depends greatly on environmental conditions. Format obsolescence adds to the crisis: Umatic and VHS tapes are no longer manufactured, and BetaSP will soon be discontinued. Machines to play these formats are becoming scarcer, as are the skills to maintain and repair them.

Of course, it’s not only videotape that’s at risk. Entropy is relentless, and anything recorded on old storage media will eventually have to be transferred and digitized. Even if the medium remains intact, formats and interfaces become obsolete and disappear. Preserving data for the long term is a discipline worth more attention than we can give it here, but a few tips might be helpful.

Lifespan comparison of different backup storage media


Keep track of how long media is likely to last – but remember that such statistics are controversial projections, and far from precise. The general consensus is that consumer-grade CD-Rs should last 30 to 50 years, DVD-Rs less than that, and CD-RWs and DVD-RWs even less. Similarly, tapes and hard disks can be expected to remain readable for 10 to 30 years, while portable disks, USB thumb drives, and other solid-state storage devices may survive for half that time, maybe.

Back in 2005, The New York Times reported that 3.5” floppies have “an estimated life span of 10 years if stored in a cool, dry place with average care and use”. If you’ve still got any, we’ll bet they’re older than that!

With this in mind, regularly copy data to new media, especially if it is approaching its expiration date. And make sure anything you haven’t yet copied is kept “in a cool, dry place,” not your attic or garage. It is strongly recommended to use specialized archival optical media, such as FalconMedia Century Archival, which can secure your data for up to 500 years.
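One way to act on this advice is to record each medium’s write date and flag anything that has consumed a given fraction of a conservative lifespan estimate. The sketch below uses illustrative figures drawn from the ranges quoted above; the `LIFESPAN_YEARS` table and `needs_migration` helper are hypothetical, and real lifetimes vary widely with storage conditions.

```python
# Conservative lifespan estimates in years (illustrative figures
# based on the ranges discussed above; actual lifetimes vary).
LIFESPAN_YEARS = {
    "cd-r": 30,
    "dvd-r": 20,
    "tape": 10,
    "hard-disk": 10,
    "usb-flash": 5,
    "floppy-3.5": 10,
}

def needs_migration(media_type: str, year_written: int,
                    current_year: int, safety_margin: float = 0.5) -> bool:
    """Flag media once it has consumed the given fraction of its
    conservative lifespan estimate (default: half)."""
    lifespan = LIFESPAN_YEARS[media_type]
    return (current_year - year_written) >= lifespan * safety_margin

print(needs_migration("cd-r", 1998, 2017))       # True  (19 >= 15)
print(needs_migration("hard-disk", 2014, 2017))  # False (3 < 5)
```

The generous safety margin is deliberate: copying early is cheap, while discovering an unreadable disc is not.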

Move away from physical formats that are becoming obsolete. For example, many people who used to back up their data on Zip drives, Syquest cartridges, and 1.44MB floppy drives no longer have access to these. Even interfaces can be an issue: external devices often used serial or parallel ports that no longer ship standard on computers (though desktop PC and ExpressCard laptop adapters can still be found). Make sure you’ve migrated your data before you dispose of an old device or format.

A common related issue: data trapped on a working hard disk in a dead PC or laptop. The Guardian serves up some useful guidance on installing the drive in an external USB enclosure and restoring from there.

Migrate data from obsolete programs, or at least make sure you have the tools to do so when necessary. Millions of people still have content trapped in ancient word-processing formats. Tools for viewing such data or moving it into “living” software include Quick View Plus and FastLook; for some formats, the free LibreOffice productivity suite or the XnView image viewer might be all you need.

TechRepublic also offers some useful high-level advice on planning a long-term strategy for protecting your data.

All this is great as far as it goes, but as the amount of data we’re generating continues to soar, we’re likely to need something radically new. Here are some technologies that may improve data storage in the near future:

Analog micro-etching: The Long Now Foundation – which specializes in envisioning the long-term future and solving the problems it might present – ran a full conference on super-long-term data storage. The solution it found promising enough to test was analog micro-etching onto nickel disks. Eight years later, it had a prototype: a disk containing information in about 1,500 human languages, plus a translation of the Book of Genesis in each. Since the information is analog, it is readable directly by humans (though they will need a microscope).

The Arctic World Archive: Officially opened on March 27 in Norway’s Svalbard Arctic region, the for-profit Arctic World Archive is already housing key documents from Brazil, Mexico, and Norway — safe, theoretically, from natural disaster and warfare. According to a report in The Verge, data is actually imprinted on special film, in huge high-density greyscale QR codes – and the archive is completely disconnected from the Internet to protect against hackers and ransomware.

DNA:  According to Science Magazine, researchers have been making breathtaking progress since the first attempts to store data in DNA molecules back in 2012. DNA is ultracompact, and it can last hundreds of thousands of years if kept in a cool, dry place. And as long as human societies are reading and writing DNA, they will be able to decode it – not something you can say with confidence about videocassettes or QR codes.

Source: Naked Security

How smartphones became our personal portable data banks

It is stating the obvious to say that, over the last couple of decades, mobile telecommunications has entirely changed the world we live in. Over this period we have gradually switched from handwritten paper contact books to electronic contact records in our mobile phones. But it is not only phone numbers we store in our phones anymore: with the development of smartphones, they have become our own personal data banks.

Alongside the contacts, our small electronic friends now store so much data (passwords, photos, music, sometimes even medical records and biometric parameters) that losing a device would most likely be a total disaster for one’s day-to-day routine. Some people don’t even remember the passwords for their social media accounts, because their phones keep them securely stored.

Losing that sensitive data is a problem, but transferring it to another device can be even more problematic.

Everyone has, at least once in their life, switched from one mobile phone to another. Back in the 2000s it wasn’t such a big deal: you switched the SIM card and all your data was easily transferred to the new phone. Those were the good old days, when contacts were stored on the SIM card and there was no hassle with gigabytes of photos and music. Those earlier phones were pretty much meant just to make calls and exchange texts: no cameras, no media players, not to mention mobile internet.

It was later, in 2006-2007, as the smartphone market started to emerge, that the problem of data transfer between devices grew, with different mobile operating systems developing in completely different directions.

By the beginning of the 2010s it had become obvious that data transfer between Android and iOS devices was so difficult and time-consuming that leading developers could no longer ignore consumers’ complaints, and a revolutionary step was taken.

As one of the market leaders and most innovative consumer electronics companies, Apple made iPhone owners’ lives easier by launching the “Move to iOS” app, which provides an easy way to move contacts and other data from an Android phone to an iPhone.

Google, Apple’s biggest rival in the mobile operating systems market, developed similar technology for its own mobile device, the Pixel, and even included a dedicated adapter to make the data transfer procedure easier.

According to Google, Pixel phones ship with a dedicated Switch capability that allows users to transfer contacts, calendar events, photos, videos, music, SMS messages, iMessages and more from one device to the other. The Quick Switch Adapter is a dedicated On-The-Go adapter shipped in the Pixel box, as Google confirms in its Pixel specifications.

Google describes the switch as a three-step process. Older phones have to run Android 5.0 and up, or iOS 8 and above for iPhones.

If data has to be transferred from an older Android phone, the process is relatively simple. iPhone users first turn off iMessage and FaceTime, then remove the SIM card. Next, you sign into your Google Account from the Pixel. Finally, Google asks you to select what data needs to be imported.

Once that’s all decided, Google takes over and migrates the requested data. It’s as simple as that.

These developments are a great example of how consumers benefit from healthy market competition and innovative thinking about how to secure the personal data that smartphones carry nowadays. Anyone who works in the IT industry will tell you: ALWAYS back up your data. No matter how secure you think cloud technology systems and personal hard drives are, remember one thing: once data is lost, it is lost forever.

Therefore, as a conclusion to this article, we at Falcon Technologies International strongly recommend using dedicated archival optical media solutions to store all the sensitive and valuable data that you would like to keep secure for a long time. It doesn’t take much time to burn a couple of DVDs, but it will ensure that your data is insured.

At the Dawn of the Computer Age: Memories of the “Informational Revolution” Pioneers.

Do you remember your very first computer? Pretty much everyone does; most people in their mid-30s to early 40s can still remember those noisy big white boxes with huge square screens and clicking dial-up modems that took ages to download a plain-text news article, or even a basic e-mail with no attachments. Well, it took almost 40 years for the technology to get to that point, and there are still witnesses alive today to how it all started in the basements of world-famous universities and colleges.

Joyce Wheeler is someone who saw it all in those early days. She too can still remember her very first computer – partly because it was one of the first computers anyone used.

Dr. Joyce Wheeler was among the pioneers of programming

It was EDSAC (Electronic Delay Storage Automatic Calculator), a “proto-computer” assembled at the University of Cambridge in 1949, where it served scientists for years. Joyce Wheeler was one of a group of research students working towards their PhDs under the supervision of the famous astronomer Fred Hoyle. They were researching the reactions inside stars – in particular, the stages of the stellar lifecycle and their length.

To perform the research, Joyce needed some powerful calculating equipment, since the inner workings of the nuclear furnace that keeps stars shining are far too complicated a problem to solve with a human brain, a pencil and a piece of paper. The mathematics capable of describing nuclear energy processes at this level is formidable: Joyce remembers having to solve a nasty set of differential equations describing the stars’ behaviour and composition.

A copy of Edsac is being built at the National Museum of Computing

Completing these calculations manually would almost certainly introduce errors and inaccurate data, and could – and probably would – affect the research outcomes. This is where she met EDSAC, a machine built by Professor Maurice Wilkes: a device the size of several average bedrooms that was there to do exactly the kind of calculations Ms. Wheeler needed for her advanced degree.

The first challenge for the young astrophysics student was to learn the sophisticated language the machine could understand. She was already familiar with the machine itself, having been shown it before the start of her degree course in 1954. Keen to get her research done accurately, Joyce sat down with an instruction booklet and worked her way through dozens of exercises from that pioneering programming manual. The little book was called WWG, after the initials of its authors: Maurice Wilkes, David Wheeler and Stanley Gill.

The foundations of programming were laid down by Edsac's creators

While learning to program, Joyce (whose family name was Blacker at the time) got talking to David Wheeler, since one of her programs was helping to ensure that EDSAC was working well. They got to know each other, fell in love, and married in 1957.

Joyce remembers that exciting time in detail: she could not stop wondering what the machine could do for her work. Thanks to her strong mathematical background she learned to program quickly, rapidly mastering the syntax into which she had to translate endless complex equations.

At a certain point she realized that programming is very similar to mathematics, in the sense that one can’t do it for too long at a stretch.

“I found I could not work at a certain programming job for more than a certain number of hours per day,” Joyce Wheeler remembers. “After that you would not make much progress.”

Research students like Joyce Wheeler had to use Edsac at night

Sometimes the solution to a programming problem that had been bothering her would come to mind while she was doing something else outside the computer lab, like doing the laundry or having lunch.

“Sometimes it’s better to leave something alone, to pause, and that’s very true of programming.”

When the programming was finally done, Joyce Wheeler was allocated a timeslot to run her programs on EDSAC: Friday night. She remembers that this slot was perfect for her, as there were no lectures to attend the next day.

As an operator she was granted the right to run EDSAC alone, but she had to make sure that everything she did was recorded. Unexpected crashes were quite a common occurrence for all early computers, and EDSAC was no exception. Joyce remembers that only occasionally was she lucky enough to keep the machine running all night, and if it did crash, there was little she was allowed to do to fix it. Even the cleaners were not allowed near EDSAC.

Dr. Wheeler showed Joyce one procedure that allowed the recalibration of EDSAC’s two-kilobyte memory, but if that did not help, Joyce had no choice but to stop her work for the night. Despite the regular crashes, she made steady progress in finding out how long different stars would last before they collapsed.

“I got some estimates of a star’s age, how long it was going to last,” she said. “One of the nice things was that with programming you could repeat it. Iterate. You could not do that with a hand calculation. We could add in sample numbers on programs and it could easily check them. I could check my results on the machine very rapidly, which was very useful.”

Bear in mind that “rapidly” back in the 1950s meant “within about 30 minutes” – the time EDSAC required to run a program. The results were then printed out for the researcher to analyze, after which you had to re-program and wait another couple of days to run the next round of complex calculations. Despite all these delays, Ms. Wheeler felt that she was part of something that would change the world.

“We were doing work that could not be done in any other way,” she said. And even though EDSAC was crude and painfully slow by modern standards, she saw that a revolution had begun.


We at FTI never fail to be inspired by pioneering scientists like Dr. Wheeler. Their single-minded dedication to finding new solutions to existing problems – often in lonely circumstances, running against the tide of conventional thinking – drove them to expand the frontiers of discovery and learning in ways that eventually became part of everyday life for the entire global population. Innovation, research and patience are among the core values we cherish at FTI, and no one demonstrated them better than Dr. Joyce Wheeler.

Recovering Old Files: Challenge or Lesson to Study?

Information technology is developing so fast that data we stored only a few years ago is often stuck on old disks, with very few drives still able to read them. Computer forensics experts can uncover old files, sometimes solving crimes along the way.

Computer forensics specialist is investigating an old laptop

When new files relating to the South Yorkshire Police’s handling of the Hillsborough disaster emerged, many took the form of dog-eared notebooks and water-damaged folders. But amongst the evidence handed over to the Independent Police Complaints Commission there were also 167 floppy disks – containing hundreds of documents that were potentially critical to the investigation.

Paper is a reasonably useful medium; if it gets a bit damp or dirty you can still read the marks that have been made on it. But the same can’t be said for magnetic storage. The exhibits manager for the inquiry, David Wolstencroft, and his team had to purchase an ageing computer in order to read the 5.25-inch floppies – disks much bigger than the 3.5-inch ones most of us remember using on our PCs before they died out, and a medium already well on the way to obsolescence at the time of the 1989 disaster. “We got them all read,” he told the BBC when the analysis was just getting under way in 2013, “and they’ve come back on two small disks [DVDs] that aren’t even full. It’s unbelievable the way technology has changed.”

Old floppy discs may still be a source of important evidence in old cases

The process of transferring data from old media such as floppy disks to more modern, readable formats might sound relatively easy, but the transient nature of modern technology can make it hugely problematic. Tracking down and purchasing a computer from the era of shoulder pads and Crocodile Dundee would seem like a promising initial step, but the subsequent journey is uncertain.


“If you boot up an old machine,” says Tony Dearsley, Principal Consultant at digital forensics firm Kroll Ontrack, “you have no idea what’s going to happen. It will have been sitting in a cupboard for 15 or 20 years. You’re going to have component failure, capacitors are going to die. Ideally you’d try to avoid doing that altogether.”


Attempting to boot up an old PC very often ends up at this point

Our increasing reliance on technology and the related increase in the digital information we all generate has fuelled a massive rise in the number of firms offering digital forensics services. But when old cases are reopened and dusty technology resurfaces, experts face multiple challenges.

Floppy disks, from the 8-inch to the 5.25-inch to the 3.5-inch, become less willing to yield up their contents with every passing day. Even hard disks, which theoretically have some measure of protection from the atmosphere, still experience decay.

Data storage media have changed so many times during the last 50 years that it seems we are still looking for the perfect way to store the huge volumes of information we generate daily. And as the years go by, the compatibility of data carriers and readers becomes a bigger and bigger problem. If data archiving were somehow standardized, we would most likely not face such challenges anymore.

Many kinds of sensitive data have to be stored and preserved in a way that can be easily accessed after years, decades or even centuries. As touched upon already, paper is a good storage medium, but it has a number of disadvantages, from its large physical volume per unit of information to its poor resistance to environmental influences.
Century Archival DVDs are able to secure data for hundreds of years

Data storage experts agree that, as of today, there is no better alternative to optical media for data archiving. Professional archival-grade DVDs and CDs with gold and platinum layers are able to secure data for centuries – a fact proven by a number of professional tests under severe environmental conditions.

Falcon Technologies International has a specially designed product line called Century Archival, which is a perfect solution for long-term data storage. It is not only cost-efficient, but also a guarantee of secure and lasting data storage.