Have you ever found that your mobile phone has suddenly died and will never turn on again? The first thing many people think of in this situation is probably the loss of the data on their device. After several years of using any data storage device, you will probably have hundreds of photos, files, notes and messages stored on it, including records of conversations that matter to you. You may then try data recovery programs, but the problem becomes far harder if your device was stolen or simply lost.
The situation is far more critical for institutions such as governments, companies, and research and educational institutes. For example, the data center of the US National Security Agency covers around 1,000,000 square feet. One of the most rapidly growing data stores belongs to Facebook, which increases its storage capacity by 7,000 terabytes monthly to host the media posted by users; Facebook currently operates 12 data centers covering a total area of about 15,000,000 square feet. Other examples of massive data storage include Google, Instagram, Twitter and Hotmail. Studies estimate that worldwide data will grow to around 163 zettabytes by 2025, equivalent to 163 billion terabytes.
Developers of data storage technologies are working hard to provide more efficient, less energy-consuming, safer and higher-capacity storage devices. Among the major changes is the widespread adoption of cloud storage services, in which data are stored at a remote site, or even distributed across several locations, managed by a hosting company. The concept was conceived in the 1960s and first offered as a service in 1983 by CompuServe. For years, however, it was viewed with suspicion, with significant concerns over the safety and privacy of shared files, and it is hard to pinpoint exactly when cloud storage became fully accepted by everyone from individual users to multinational enterprises.
Nasuni, a private cloud storage company, released a study showing that the volume of data stored in the cloud reached 1 exabyte, more than 1 billion gigabytes, in 2018. Most mobile users around the world now back up their data to the cloud, while more companies, including Apple, Netflix, Instagram and Xerox, are moving their operations there.
One reason for the emerging need for larger storage capacity is the increasing use of artificial intelligence. These technologies depend mainly on the availability of large datasets from which machines can infer patterns. Conversely, artificial intelligence can be used to improve the efficiency of storage systems: machines can learn the most common trends in data usage and then allocate the available resources for the best performance.
For example, machines can identify rush hours, which differ by time and location, and dedicate more processors to those locations at those times. Old data that is rarely retrieved can likewise be moved to remote, slower data centers that consume less energy. This approach is expected to significantly increase the efficiency of data storage and processing while reducing energy consumption and cost.
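The hot/cold migration idea described above can be sketched in a few lines. This is a toy illustration only: the class name, tier names and demotion threshold below are invented for the example, not taken from any real storage system.

```python
from collections import Counter

class TieredStore:
    """Toy two-tier store: a fast 'hot' tier and a cheap, slow 'cold' tier."""

    def __init__(self, demote_threshold=2):
        self.hot = {}            # frequently accessed objects
        self.cold = {}           # rarely accessed objects
        self.reads = Counter()   # access count per key
        self.demote_threshold = demote_threshold

    def put(self, key, value):
        # New data starts in the fast tier.
        self.hot[key] = value

    def get(self, key):
        self.reads[key] += 1
        if key in self.cold:     # promote back on access
            self.hot[key] = self.cold.pop(key)
        return self.hot[key]

    def rebalance(self):
        # Demote objects read fewer times than the threshold.
        for key in list(self.hot):
            if self.reads[key] < self.demote_threshold:
                self.cold[key] = self.hot.pop(key)

store = TieredStore()
store.put("old_photo", b"...")
store.put("daily_report", b"...")
store.get("daily_report")
store.get("daily_report")
store.rebalance()
print(sorted(store.cold))  # the rarely read object ends up in the slow tier
```

A real system would of course use observed access patterns over time rather than a fixed count, but the principle, learning what is "hot" and placing it on faster hardware, is the same.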
It can also be safely predicted that magnetic hard disk drives (HDDs) will become obsolete as they are steadily replaced by solid-state drives (SSDs). The latter use integrated circuits to store data without any moving parts; they consume less energy and are better suited to mobile devices such as phones, laptops and tablets, in which traditional HDDs are easily damaged. Intel has recently introduced “ruler”-style drives designed to pack easily into a standard rack, and says the new drives can store 1 petabyte of data in 1/100 of the space required by current HDDs.
New technologies are still under development in pursuit of more efficient systems. For example, the University of Nottingham announced a technology, which it calls magnetic domain walls, that uses electric fields instead of magnetic fields to control memory. Meanwhile, the Moscow Institute of Physics and Technology announced magnetoelectric random access memory (MELRAM), which is claimed to reduce the energy consumption of data storage by a factor of 10,000. This technology uses magnetization in two opposite directions to represent either 0 or 1.
Chemists at Case Western Reserve University have proposed a completely new storage method that abandons traditional binary coding. They propose a quaternary (4-symbol) code based on dyes deposited on polymer films. This method is claimed to halve the storage space needed for a given amount of data; however, it would require re-encoding all data generated to date, which may be feasible with automated conversion programs. An even more challenging proposal is to use DNA for data storage, much as living cells do. That approach would require tight control over DNA writing and reading techniques that are not yet available.
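The halving claimed above follows from simple arithmetic: a 4-symbol alphabet carries two bits per symbol instead of one, so the same data needs half as many symbols. A small sketch, in which the alphabet is written as the four DNA bases purely for illustration (real dye or DNA chemistry would differ):

```python
# Each pair of bits maps to one of four symbols (e.g. four dyes, or DNA bases).
SYMBOLS = "ACGT"  # illustrative alphabet, not the actual chemistry

def to_quaternary(bits: str) -> str:
    """Encode a binary string into quaternary symbols, two bits per symbol."""
    assert len(bits) % 2 == 0, "pad to an even number of bits first"
    return "".join(SYMBOLS[int(bits[i:i + 2], 2)] for i in range(0, len(bits), 2))

def to_binary(symbols: str) -> str:
    """Decode quaternary symbols back to the original binary string."""
    return "".join(format(SYMBOLS.index(s), "02b") for s in symbols)

encoded = to_quaternary("01101100")   # 8 bits -> 4 symbols
assert to_binary(encoded) == "01101100"
print(encoded, len(encoded))          # 4 symbols: half the count of the 8 bits
```

The same two-bits-per-base mapping is why DNA, with its four nucleotides, is so often discussed as a storage medium.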
One of the most advanced data storage technologies, and one long under investigation, is single-atom storage. The ultimate goal of this approach is to store bits of data by placing individual atoms on a surface; the data can then be read using quantum-mechanical techniques.
EPFL in Switzerland has been leading this field and was the first to demonstrate that single-atom magnets can be used to store and read data. Physicists at the institute recently published a study in Physical Review Letters showing the stability of a magnet consisting of a single holmium atom, after exposing it to harsh conditions, including high temperatures and magnetic fields, that could demagnetize single-atom magnets.
In conclusion, the race toward new data storage technologies will continue in the hope of keeping pace with exponentially growing data volumes, estimated at 15 million gigabytes of new data daily.