
Some Amazing Facts about Memory (Petabyte, Exabyte, Zettabyte) - and How Much Data Exists on the Web?

The digital world is growing so fast that our need for storing data increases day by day. I remember our computer teacher once telling us that he had purchased a Personal Computer XT/370, paid more than a lakh for it, and that it contained 512K of memory chips. But now you can find 64-bit laptops with 8 GB of RAM, and even mobiles come with 3 GB of RAM (the Samsung Galaxy Note 3, for example).

And moving on to hard disks, we are no longer talking about GBs; now it is all about terabytes. And we do not stop there - we have moved toward the petabyte (1,000 TB). This brings to mind Moore's Law, which says that the number of transistors on a chip doubles every 1.5 to 2 years.

Let me share some interesting facts:

The biggest photo-sharing website, Facebook, has over 15 billion photos. For each uploaded photo, Facebook generates and stores four images of different sizes, which translates to a total of 60 billion images and 1.5 petabytes of storage; today the numbers will be even greater.
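The arithmetic behind those numbers is easy to check. Here is a small Java sketch; the average size per stored image is derived from the quoted figures, not stated in the original:

```java
// Sanity-checking the quoted Facebook photo-storage figures.
public class PhotoStorage {
    public static void main(String[] args) {
        long photos = 15_000_000_000L;      // uploaded photos (quoted)
        long images = photos * 4;           // four sizes stored per photo
        double storageBytes = 1.5e15;       // 1.5 petabytes (decimal, quoted)

        // Derived: average bytes per stored image.
        double bytesPerImage = storageBytes / images;

        System.out.println("Stored images: " + images);               // 60 billion
        System.out.printf("Average per stored image: ~%.0f KB%n",
                bytesPerImage / 1e3);                                  // ~25 KB
    }
}
```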

- If the average MP3 encoding for mobile is around 1 MB per minute, and the average song lasts about four minutes, then a petabyte of songs would last over 2,000 years playing continuously.
- If the average smartphone camera photo is 3 MB in size and the average printed photo is 8.5 inches wide, then a petabyte of photos placed side by side would stretch over 48,000 miles - almost long enough to wrap around the equator twice.
- One petabyte is enough to store the DNA of the entire population of the US - and then clone them, twice.
- If you counted one byte per second, it would take 35.7 million years to count to a petabyte.
- It would take 223,000 DVDs (4.7 GB each) to hold one petabyte.
- It would take 746 million 3.5-inch high-density floppy discs (1.44 MB each) to hold one petabyte; 746 million floppy discs would weigh 13,422 tonnes (at 18 g each).
- Astronomical project: "The Square Kilometre Array (SKA), funded by 20 countries to the sum of €1.5bn, is a radio telescope that can read faint signals from the Big Bang." The SKA (due to be completed in 2024) will, IBM ("Big Blue") estimates, generate 1,376 petabytes per day - twice the daily global internet traffic.
- Facebook was storing 100 petabytes a year ago, according to its IPO filing with the US Securities and Exchange Commission on 1 February 2012.
- Google processed about 24 petabytes of data per day in 2009.
- It is estimated that the human brain's capacity to store memories is equivalent to about 2.5 petabytes of binary data.
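A few of these facts can be reproduced with back-of-the-envelope Java. Note that the quoted figures appear to use the binary petabyte (2^50 bytes); the constants below are my assumptions chosen to match them:

```java
// Rough verification of some of the petabyte facts above,
// assuming a binary petabyte (2^50 bytes).
public class PetabyteFacts {
    public static void main(String[] args) {
        double pb = Math.pow(2, 50);                 // binary petabyte in bytes

        // Songs: ~1 MB of MP3 audio per minute of playback.
        double minutes = pb / 1e6;
        double years = minutes / (60 * 24 * 365.25);
        System.out.printf("Continuous playback: ~%.0f years%n", years);   // > 2,000

        // Counting one byte per second.
        double countYears = pb / (3600.0 * 24 * 365.25);
        System.out.printf("Counting a byte per second: ~%.1f million years%n",
                countYears / 1e6);                                        // ~35.7

        // DVDs: treating 4.7 GB as binary (4.7 * 2^30 bytes)
        // reproduces the ~223,000 figure quoted above.
        double dvdBytes = 4.7 * Math.pow(2, 30);
        System.out.printf("DVDs needed: ~%.0f%n", pb / dvdBytes);         // ~223,000
    }
}
```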

How much data exists on the Web?

Now a big question arises here: if I wanted to download the whole internet, what size of hard disk would I need? In other words, how much data exists online? According to some research, about 1.2 zettabytes (1.3 trillion gigabytes) are now stored. So if you wanted to download the complete data from the internet at a speed of 100 MB/sec, it would take about 380,517 years to complete.
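That 380,517-year figure falls out of straightforward division (using a 365-day year):

```java
// How long would downloading 1.2 zettabytes take at 100 MB/sec?
public class DownloadTheInternet {
    public static void main(String[] args) {
        double bytes = 1.2e21;                  // 1.2 zettabytes (decimal)
        double bytesPerSecond = 100e6;          // 100 MB per second
        double seconds = bytes / bytesPerSecond;
        double years = seconds / (3600.0 * 24 * 365);  // 365-day year
        System.out.printf("~%.0f years%n", years);      // ~380,517
    }
}
```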

This year, it is estimated that 70 percent of the data arriving on the internet will be user-generated, amounting to about 900 exabytes.

