Holography breaks through the density limits of conventional storage by recording not only on the surface of the medium but through its full depth. Unlike technologies that record one data bit at a time, holography allows a million bits of data to be written and read in parallel with a single flash of light, enabling transfer rates significantly higher than those of current optical storage devices. This combination of high storage density, fast transfer rates, and durable, reliable, low-cost media makes holography poised to become a compelling choice for next-generation storage and content distribution.
In addition, the flexibility of the technology allows for a wide variety of holographic storage products, ranging from handheld devices for consumers to storage products for the enterprise. Imagine 2GB of data on a postage stamp, 20GB on a credit card, or 200GB on a disk. How is data recorded? Light from a single laser beam is split into two beams: the signal beam, which carries the data, and the reference beam. The hologram is formed where these two beams intersect in the recording medium.
Data is encoded onto the signal beam by a device called a spatial light modulator (SLM). The SLM translates the electronic 0's and 1's into an optical "checkerboard" pattern of light and dark pixels, arranged in an array, or page, of around a million bits; the exact number of bits is determined by the pixel count of the SLM. Where the reference beam intersects the data-carrying signal beam, the hologram is recorded in the light-sensitive storage medium.
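The page-encoding step can be sketched in a few lines of Python. The 1024x1024 page size here is an assumption standing in for "around a million bits"; real SLM pixel counts and modulation schemes vary by device.

```python
# Hypothetical sketch of how an SLM arranges a bit stream into a data page.
# 1 = bright pixel, 0 = dark pixel; 1024x1024 is an assumed page size.
def encode_page(bits, rows=1024, cols=1024):
    """Arrange a flat list of 0/1 bits into a 2D 'checkerboard' page."""
    if len(bits) != rows * cols:
        raise ValueError("page expects exactly rows*cols bits")
    return [bits[r * cols:(r + 1) * cols] for r in range(rows)]

page = encode_page([i % 2 for i in range(1024 * 1024)])
print(len(page), len(page[0]))  # 1024 1024
```

The whole page is then imprinted on the signal beam at once, which is what makes the later parallel readout possible.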
A chemical reaction occurs in the medium where the bright elements of the signal beam intersect the reference beam, causing the hologram to be stored. By varying the reference beam angle, the wavelength, or the media position, many different holograms can be recorded in the same volume of material. How is data read? To read the data, the reference beam is deflected off the hologram, reconstructing the stored information. The reconstructed hologram is then projected onto a detector that reads out the whole data page in parallel; this parallel readout is what gives holography its fast transfer rates.
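Because a full page of roughly a million bits is recovered per flash, the transfer rate is simply the page size times the page rate. The figures below are illustrative assumptions, not specifications of any real drive:

```python
# Back-of-the-envelope transfer rate for parallel page readout.
# Both numbers are assumed for illustration.
page_bits = 1024 * 1024          # ~1 million bits per page
pages_per_second = 1000          # assumed detector/laser page rate
rate_bits = page_bits * pages_per_second
print(rate_bits / 8 / 1e6, "MB/s")  # 131.072 MB/s
```

Even this modest assumed page rate yields well over 100 MB/s, far beyond bit-at-a-time optical drives.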
A feasibility study examines whether the end product can be manufactured in the planned quantity and with the desired quality, and whether its sale will generate an adequate return to pay back the investment within a reasonable period over the project's life, using the facilities installed and the resources employed. Why would you even want a 1 TB CD? In five years, demand for such a product will exist, especially once wireless streaming from the device is added. For example, users could scroll through 200 movies, 10,000 photos, and 50,000 songs on a single CD.
As for the feasibility of such a device, a 1TB CD is possible within five years. 1.8-inch drive capacities should reach 500GB in that time, which would make it possible to create a 1TB CD with a two-platter drive, just as two 30GB platters are combined today to produce a 60GB CD. The gigabit-ethernet unit has two drive bays, each approved to accept a SATA hard drive of up to 750GB, for a total of 1.5TB. The company is still evaluating the feasibility of using two 1TB drives in the chassis instead, for a total of 2TB.
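The capacity arithmetic above reduces to multiplying platter (or drive) capacity by the number of units, as this small sketch shows (all values in GB, matching the figures quoted):

```python
# Multi-platter/multi-bay capacity arithmetic for the figures above.
def drive_capacity(unit_gb, units=2):
    """Total capacity of a drive built from identical platters or bays."""
    return unit_gb * units

print(drive_capacity(30))    # 60   -> today's two-platter 60GB example
print(drive_capacity(500))   # 1000 -> the projected 1TB CD
print(drive_capacity(750))   # 1500 -> the 1.5TB two-bay SATA unit
```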
We plan to simulate Fast Ignition with self-consistent fields at full parameters. Our final goal is a 500 x 20 x 20 micron plasma at up to 100 times the critical density, and our target is, of course, a 1 TB CD. A traditional PIC code would require an unrealistically huge number of particles to simulate such parameters, and it is impossible to run the PIC code at this scale even on a massively parallel computer. One way to solve this problem is to treat only the super-hot electrons as particles and the other, background electrons as a fluid.
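A rough count of real electrons in the stated volume shows why brute-force PIC is out of reach. The critical density assumed below corresponds to a ~1 micron laser wavelength, which the text does not specify, so treat it as an illustrative assumption:

```python
# Rough electron count for a 500x20x20 micron target at 100x critical
# density. The 1-micron-wavelength critical density is an assumption.
n_critical = 1.1e21             # cm^-3, critical density for ~1 micron light
density = 100 * n_critical      # 100x critical, as stated above
volume_cm3 = (500 * 20 * 20) * 1e-12   # cubic microns -> cm^3
electrons = density * volume_cm3
print(f"{electrons:.1e} electrons")    # ~2.2e16 electrons
```

Even mapping many real electrons onto each macroparticle, tens of quadrillions of electrons is far beyond what a full PIC run can represent, which motivates the hybrid particle/fluid split described next.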
Since the super-hot currents are on the order of 100-1000 MA, the return-current electrons become warm, and the cold return current should not be treated as having the same temperature as the background electrons. We should therefore employ two-temperature electron fluids for the return current and the background. This hybrid approach, however, carries uncertainty about the return-current temperature and the densities of both fluids. To avoid these difficulties, and to reduce the number of particles and the computational cost, we carried out a one-dimensional feasibility study for the PIC code with collective particles and found that the 1TB CD worked well… but it is still in the R&D stage.