How many bytes of information in one human body?

Discussion in 'Computer Science & Culture' started by Magical Realist, Sep 16, 2013.

  1. Magical Realist Valued Senior Member

    Messages:
    16,700
    "For example, how many megabytes of genetic data are stored in the human body? For simplicity’s sake, let’s ignore the microbiome (all non-human cells that live in our body), and focus only on the cells that make up our body. Estimates for the number of cells in the human body range between 10 trillion and 100 trillion. Let us take 100 trillion cells as the generally accepted estimate. So, given that each diploid cell contains 1.5 GB of data (this is very approximate, as I am only accounting for the diploid cells and ignoring the haploid sperm and egg cells in our body), the approximate amount of data stored in the human body is:

    1.5 GB × 100 trillion cells = 150 trillion GB = 150 × 10^12 × 10^9 bytes = 150 × 10^21 bytes = 150 zettabytes (1 ZB = 10^21 bytes)!"
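
    As a rough sanity check on that arithmetic, here is a back-of-the-envelope sketch in Python. The ~3.2 billion base pairs and 2 bits per base (A/C/G/T) are my assumptions for where the 1.5 GB per diploid cell might come from; the quoted article doesn't derive the figure.

    # Back-of-the-envelope check of the quoted figures (all values are rough estimates).
    base_pairs = 3.2e9                      # haploid human genome: ~3.2 billion base pairs (assumed)
    diploid_bytes = 2 * base_pairs * 2 / 8  # two copies, 2 bits per base, 8 bits per byte
    print(f"per diploid cell: {diploid_bytes / 1e9:.1f} GB")   # ~1.6 GB, close to the quoted 1.5 GB

    cells = 100e12                          # upper estimate of cells in a human body
    total_bytes = 1.5e9 * cells             # using the article's 1.5 GB per cell
    print(f"total: {total_bytes:.1e} bytes = {total_bytes / 1e21:.0f} ZB")  # 1.5e+23 bytes = 150 ZB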


    This is truly an enormous amount of data! To get an idea of how big just 1 zettabyte is, consider this:

    "Just how big is a zettabyte? That's the question one blogger is attempting to understand after reading a report that states humans will create about 1.2 zettabytes in 2010.

    In a relatively humorous and sarcastic post, blogger Paul McNamara looks at a recent news release from EMC and attempts to understand, and help us understand, just how big a zettabyte is.

    Here are three analogies from the EMC article:

    • "The digital information created by every man, woman and child on Earth 'Tweeting' continuously for 100 years."

    • "75 billion fully-loaded 16 GB Apple iPads, which would fill the entire area of Wembley Stadium to the brim 41 times, the Mont Blanc Tunnel 84 times, CERN's Large Hadron Collider tunnel 151 times, Beijing National Stadium 15.5 times or the Taipei 101 Tower 23 times."

    • "A full-length episode of FOX TV's hit series 24 running continuously for 125 million years."

    As McNamara points out, these analogies are oftentimes outside the scope of what the normal human -- even the normal IT person -- can fully grasp. For example, the "tweeting" example requires the knowledge of how many humans there are in the world (which sources can't seem to agree on) and some fairly fuzzy math, since the unit of time is 100 years.

    Beyond the analogies, the EMC article outlines the fact that humans created 800 billion gigabytes (800 million terabytes, 800,000 petabytes, 800 exabytes, or 0.8 zettabytes) in 2009 and, based on the growth of 62% over 2008's data, posits that humans will create about 1.2 zettabytes in 2010. This number is just massive, especially considering most humans have never even seen a terabyte of data. Even the storage experts among us are only considering data storage in the dozens to hundreds of terabytes. When you add it all up -- all the big companies, governments, healthcare institutions, big-budget CGI movies (Avatar) and the like -- 800 million terabytes does not seem outstanding for the entire world's storage.

    Don't plan on seeing petabyte hard drives any time soon, let alone zettabytes; some of the largest storage arrays are only holding about half of a petabyte. In addition, this count of 1.2 zettabytes is highly inflated; much of the world's data (some say 75%) is copies. In truth, only 0.3 zettabytes of unique, new information will be created this year.

    The report estimates that data creation will grow 44-fold by 2020, putting us well on our way toward yottabytes. Wrap your head around that, if you can."-http://www.techrepublic.com/blog/datacenter/goodbye-petabytes-and-exabytes-hello-zettabytes/2637
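
    The growth figures in that quote roughly check out. Here is a quick sketch using only the numbers cited above; I'm assuming the 44-fold growth is measured from the 2009 baseline, which the quote doesn't state explicitly.

    # Rough check of the EMC figures quoted above.
    zb_2009 = 0.8                 # 800 exabytes created in 2009 = 0.8 ZB
    zb_2010 = zb_2009 * 1.62      # extrapolating the 62% year-over-year growth
    print(f"2010 projection: {zb_2010:.2f} ZB")   # ~1.30 ZB, in line with the quoted 1.2 ZB

    zb_2020 = zb_2009 * 44        # "44-fold by 2020" (assumed relative to 2009)
    print(f"2020 projection: {zb_2020:.0f} ZB = {zb_2020 / 1000:.3f} yottabytes")  # ~35 ZB

    Even at 44-fold growth, the 2020 projection is still only a few hundredths of a yottabyte, which puts "well on our way toward yottabytes" in perspective.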


     
  2. Idle Mind What the hell, man? Valued Senior Member

    Messages:
    1,709
    Well, all diploid cells have nearly identical genetic information in them, meaning there is 1.5 GB of data that is backed up anywhere between 10 and 100 trillion times. Amazing redundancy.
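
    To put that redundancy in storage terms, here's a toy sketch; it treats every diploid cell as a bit-identical 1.5 GB copy, which ignores somatic mutations and everything non-genetic:

    # Toy illustration: ~1.5 GB of unique data replicated once per diploid cell.
    unique_bytes = 1.5e9
    for cells in (10e12, 100e12):       # low and high estimates of the cell count
        raw = unique_bytes * cells
        print(f"{cells:.0e} cells: {raw / 1e21:.0f} ZB raw, "
              f"{unique_bytes / 1e9:.1f} GB after perfect deduplication")

    In other words, the 150 ZB figure is almost entirely copies; the unique genetic payload would fit on a cheap USB stick.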
     
  3. Sarkus Hippomonstrosesquippedaliophobe Valued Senior Member

    Messages:
    10,397
    I would tend to agree.

    Also, I would suggest that much of the "information" is perhaps procedurally generated, meaning that the actual storage required would be far less.
    I recall a PC game called .kkrieger, a Quake-style first-person shooter written in 2004.
    This game, if coded conventionally, would have taken up 200-300 MB, but instead it was programmed in just 96 KB.

    So when we say 1.5 GB of information, is this the information that it produces, or the code that it contains (to continue the analogy)?
    How is this 1.5 GB figure established (even as an estimate)?
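
    To illustrate the code-versus-output distinction with a toy example (nothing genome-specific here; a trivially repetitive "rule" stands in for procedural generation):

    import zlib

    # A tiny "program": a short seed plus a rule that expands it to any size.
    def generate(seed: bytes, size: int) -> bytes:
        # Repeat the seed until the requested output length is reached.
        reps = size // len(seed) + 1
        return (seed * reps)[:size]

    seed = b"GATTACA"                    # 7 bytes of "code"
    output = generate(seed, 10_000_000)  # 10 MB of "produced" data

    print(len(seed), "bytes of rule ->", len(output), "bytes of output")
    # A general-purpose compressor also spots the redundancy:
    print("zlib-compressed output:", len(zlib.compress(output)), "bytes")

    A real genome is nowhere near this compressible, of course, but the point stands: the bytes a system produces and the bytes needed to specify it can differ by orders of magnitude, so the answer depends entirely on which one the 1.5 GB is meant to measure.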
     
