Turkey: Our Favorite Type of Viral


It starts in 1621
With 53 pilgrims[3]
And 90 Native Americans

Gov. William Bradford sent 4 men to hunt birds
Wampanoag Indians contributed 5 deer.

Three days of feasting saw more venison than turkey.[2]
Meal: Goose, Swan, Duck, Venison, Lobster, Shellfish, Alcohol, Pumpkin

Sailing to a new land:
One supertanker can hold 10,000x the cargo of the Niña, the Pinta, or the Santa María

We eat 61% of a supertanker’s worth of turkeys on Thanksgiving

Yearly: [5]
With well over a supertanker of green beans
Another of pumpkin
And almost three supertankers of sweet potatoes

That ain’t your typical pilgrims’ fare!

Up 108% from our turkey consumption in the ’70s,
We eat 16 lbs of turkey per person a year
Top producers:

[state, millions of turkeys]
Minnesota: 46
North Carolina: 36
Arkansas: 29
Missouri: 18
Virginia: 17
Indiana: 17
California: 16

Everything’s bigger in turkey

1.)
53.5 million spectators watch the annual Macy’s Thanksgiving Day Parade[6]
Its balloons require 300,000 cubic feet of helium to stay afloat[7]

2.)
In 2012 President Obama pardoned “Cobbler,” a 19-week-old, 40-lb turkey.[8]
Every year the president re-proclaims “a day of thanksgiving.” [9]

3.)
June is National Turkey Lovers’ Month[8]

However you carve it, Turkey is our favorite type of viral.


Citations:

  1. http://www.infoplease.com/spot/tgcensus01.html
  2. http://www.wickedlocal.com/capecod/visitor_guide/fun/x1945267987/LETS-TALK-TURKEY-5-myths-about-the-Thanksgiving-holiday?img=2
  3. http://en.wikipedia.org/wiki/Thanksgiving_(United_States)
  4. http://wiki.answers.com/Q/How_many_barreks_of_oil_does_a_oil_tanker_carry#slide2
  5. http://www.businessinsider.com/thanksgiving-food-statistics-2011-11#106-billion-pounds-8
  6. http://www.history.com/interactives/thanksgiving-by-the-numbers
  7. http://www.livescience.com/10300-helium-needed-fill-macy-parade-balloons.html
  8. http://americanhistory.about.com/od/holidays/a/thanksgiving_ff.htm
  9. https://ntuf.memberclicks.net/assets/documents/PublicRelations/2012%20annual%20report%20final.pdf

The Perfect Perfect Game in Baseball

Baseball statistician Bill James, the inventor of sabermetrics, devised what he called a “game score” to measure pitching performance in a baseball game. The larger the game score, the stronger the pitching performance. James’s game score is calculated as follows, with points added or subtracted over the course of the pitcher’s outing:

1. Begin with 50 points.
2. Add 1 point for each out (3 per inning pitched).
3. Add 2 points for completing the fifth inning and for each inning thereafter.
4. Add 1 point for each strikeout.
5. Subtract 2 points for each hit.
6. Subtract 4 points for each earned run.
7. Subtract 2 points for each unearned run.
8. Subtract 1 point for each walk.

These numbers make good intuitive sense: walks count negatively, any runs count negatively, every strikeout counts positively, etc. But this game score measure also has some quirky features. For instance, so long as the pitcher doesn’t allow any runs or walks, the game score will go up. Thus, to get the maximal possible addition to his game score in a single inning, a pitcher should strike out six batters: three of them reach base anyway on dropped third strikes by the catcher, the other three are struck out for the inning’s three outs, and the three runners are left stranded. Also, if the game can be prolonged for more than 9 innings without allowing a run or walk, that will increase the pitcher’s game score. This last feature of game score speaks to the pitcher’s endurance, and indeed, the highest game score ever recorded for a game is 153, from a 26-inning pitching performance by Joe Oeschger back in 1920 (pitchers back then tended to pitch far more games and innings).

When it comes to 9 full innings pitched (no more, no less), however, the highest game score ever recorded is 105. Cubs pitcher Kerry Wood got this game score against the Houston Astros on May 6, 1998. In that game, he struck out 20 and allowed no walks and just one hit. It was therefore neither a perfect game nor a no-hitter, but it was a shutout.

If Wood had struck out the Houston batter who got that one hit against him, Wood’s game score would have been 108. So that raises the question: how high could the game score possibly get in a perfect game? Such a game would require 27 strikeouts. Doing the calculation according to the rules above yields a maximal possible game score for a perfect game of 50 + 27 + 10 + 27 = 114 (50 to start, 27 for the outs, 10 for completing innings five through nine, and 27 for the strikeouts).
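
As a quick sanity check, here is James’s game score as a minimal Python sketch (the function and argument names are mine, not James’s):

    def game_score(outs, strikeouts, hits, earned_runs, unearned_runs, walks,
                   innings_completed):
        """Bill James's game score, per the eight rules above."""
        score = 50                                  # 1. start with 50 points
        score += outs                               # 2. +1 per out
        score += 2 * max(0, innings_completed - 4)  # 3. +2 for the 5th inning on
        score += strikeouts                         # 4. +1 per strikeout
        score -= 2 * hits                           # 5. -2 per hit
        score -= 4 * earned_runs                    # 6. -4 per earned run
        score -= 2 * unearned_runs                  # 7. -2 per unearned run
        score -= walks                              # 8. -1 per walk
        return score

    # Kerry Wood, May 6, 1998: 27 outs over 9 innings, 20 K, 1 hit, no runs, no walks
    print(game_score(27, 20, 1, 0, 0, 0, 9))   # 105

    # A hypothetical 27-strikeout perfect game: 50 + 27 + 10 + 27
    print(game_score(27, 27, 0, 0, 0, 0, 9))   # 114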

So, would a game score of 114 for a perfect game be what might be called a perfect perfect game? In such a game, every batter would strike out. Obviously, no perfect game ever pitched has seen every batter strike out. The most strikeouts recorded in any perfect game is 14 (Sandy Koufax for the Dodgers against the Cubs in 1965 and Matt Cain for the Giants against the Astros in 2012). One perfect game in recent times has had as few as 5 strikeouts (Dennis Martinez for the Expos against the Dodgers in 1991). And going back more than a hundred years, there was a perfect game with just 3 strikeouts (Addie Joss for the Cleveland Naps in 1908 against the White Sox).

Still, it seems that a game score of 114, with all 27 batters striking out, might indicate less than a perfect perfect game. If we think of a perfect perfect game as one in which the pitching performance could not be improved, then even in a game where every batter strikes out, the best possible performance would be one where the pitcher throws no balls and the batter never even makes contact with the ball; in other words, the batter swings and misses or doesn’t swing at all. This requirement is justified because if the batter makes contact, there is some probability of the ball getting into play and thus turning into a hit.

A perfect perfect game would therefore be a game with 81 pitches, each a strike, and each with the batter never laying bat on the ball.

Probabilistic Miracles: Why Nothing in Nature is Physically Impossible

The physically impossible differs from the logically impossible. It’s logically impossible that 2 plus 2 equals 5 or that a bachelor is married. On the other hand, it’s physically impossible to build a perpetual motion machine or for a human at a high-jump competition to clear 100 feet. The Second Law of Thermodynamics is supposed to bar the first of these. As for the second, given human size and strength in the earth’s gravitational field, jumping 8 feet seems about the limit; the world record, set by Javier Sotomayor in 1993, stands at eight feet and one-half inch.

But what if something physically impossible did happen? Would it be a miracle? It depends on what one means by a miracle. The Latin word from which we get our word miracle (miraculum) denoted an object of wonder. Basically, a miracle is something that causes us to go Wow! But watch Javier Sotomayor’s world record jump, and you’ll probably also go Wow! Yet you probably don’t want to call that a miracle. Sotomayor, through his natural physical talents and diligent training, got his body into a condition where it could clear 8 feet.

So a miracle isn’t something that simply causes us to be surprised, shocked, or put in a state of awe or wonder. When we call something a miracle, we usually have in mind something that also goes beyond the limits of what’s physically possible. Thus it seems that we do want a miracle to mean an event that is physically impossible. Often this is put in terms of the laws of nature forbidding the event from happening. But if the laws of nature forbid an event from happening, then how can it happen? In fact, many skeptics think that miracles are inherently self-contradictory: if something is physically impossible, then it can’t happen, and so whatever happens is not a miracle; on the other hand, if it did happen, it wasn’t physically impossible, so it wasn’t a miracle.

But what is this thing we’re calling “physical impossibility”? We can certainly imagine the occurrence of events that are physically impossible (for example, we can imagine a machine that without external energy maintains its motion for 100 years nonstop, or Javier Sotomayor clearing St. Louis’s Gateway Arch). Moreover, if we can imagine these events happening, can’t we also imagine ourselves witnessing them and thereby being compelled to think that something physically impossible did happen, in which case we would be agreeing that a miracle occurred? To say that something is physically possible is to say it is compatible with the ordinary way that nature operates. Physical impossibility would therefore seem to say that nature can’t operate extraordinarily. But why should nature be limited in that way?

The only way to block nature from operating extraordinarily is to say that nature operates by unbreakable laws that constrain it down certain paths and prohibit it from others. But how do we know that unbreakable laws determine the ordinary operation of nature, and thus prohibit its extraordinary operation? The typical way that philosophers and scientists argue this is by claiming that matter obeys certain unbreakable laws governing its operation, and that nature is identical with a system of matter. But even this argument loses its force once we acknowledge that matter can behave probabilistically. Consider, for instance, the following characterization of miracles by Oxford biologist and atheist Richard Dawkins: [Read more…]

A World Without the Post Office


The U.S. Postal Service, one of the few government agencies explicitly authorized by the U.S.
Constitution, has seen better days. After a plan by the post office to end most Saturday mail delivery,
a proposed federal budget mandated six-day delivery, despite post office officials’ contention that the
independent federal agency needs to trim costs. With steadily declining revenue, the specter of a world
without the U.S. Postal Service becomes more plausible each year.

The Size and Reach of the Post Office:

31,272

Postal Service-managed retail offices

212,530

Vehicles, one of the largest civilian fleets in the world

1.3 billion

Miles driven each year by letter carriers and truck drivers

40%

World’s mail volume handled by the Postal Service

8 million

People employed by the mailing industry that the Postal Service supports

438 million

Pieces of mail processed every day

$65 billion

2012 revenue

$1.8 billion

Salaries and benefits paid every two weeks

423 million

Annual visits to usps.com

5.7 million

Passport applications accepted every year

0

Tax dollars received for operations

152 million

Total delivery points

Blame It on Gmail? Mail Volume Falls

With easy, free access to email, and thus email marketing, the demand for mail has fallen over the past
decade.

2 in 3

Americans have access to email; that is expected to rise to 72% by 2017

More Than Mail: Our Love Affair With Stamps

The fare for sending a letter—the stamp—is a huge part of U.S. culture. We celebrate with holiday-themed stamps; we raise money with charity stamps; and we collect old stamps, well, because they’re cool.

Charity Stamps

Sold at slightly above the cost of a regular stamp, semi-postal stamps raise funds for causes identified by Congress.

Commemorative Stamps

The post office has released dozens of commemorative stamps over the years, whether related to music, entertainment, or Americana. Here are the most collected commemorative stamps.

SOURCES:

U.S. Postal Service

findyourstampsvalue.com

eMarketer


Getting Value for Your Money — And Why It Matters


We all want a good deal, and we feel bad when we don’t get one, especially if we learn later that we could have gotten something much cheaper than what we actually paid for it. In that case, we feel ripped off — that we’ve lost something.

Most of us don’t have a lot of excess cash. If we don’t use it wisely by getting good deals, we can quickly get into unmanageable debt, which, if it doesn’t lead to a serious crisis such as the foreclosure of one’s home, can put a more gradual pressure on our lives as our resources and energies are siphoned into servicing the debt. This can become a prisoner’s existence, with debts as the shackles.

So let’s say that in your personal life you are being responsible: living within your means, not paying more for things than you should, and not taking on more expenditures than you should. Let’s also say that you are giving people good value for your services, not gaming the system but creating value, so that your gain is not another’s loss. In that case, it becomes disheartening to realize that much of your money is still not being used responsibly — not by you but by government and others (usually empowered by government) who skim off your cash with the full consent of the law. [Read more…]

Decoys, Neuromarketing, and Behavioral Economics


Richard Feynman, one of the best mathematical physicists of the last century, thought that a particularly important virtue of scientists was not to fool people. To this Feynman, with his keen sense of irony, added that the easiest person to fool is oneself. Feynman, here, was speaking to the confirmation bias that infects so much of human inquiry: we tend to find and justify the very things we most wish to be true — even if they’re false.

One of the things we most wish to be true is that we are getting a good deal. Whenever we’re putting money toward anything, be it shopping at the local mall or investing in financial instruments, we want to make sure we’re getting the best deal possible. In particular, it grieves us if we find out later that we could have gotten the same product for substantially less money than we put toward it.

Humans hate loss more than they enjoy comparable gain. This finding, along with many others in the emerging field of behavioral economics, is shedding interesting light on our proclivities as buyers as well as on how to exploit those proclivities. Behavioral economics arose in the last 40 years, starting with the work of cognitive psychologists Amos Tversky and Daniel Kahneman. They found that humans, far from behaving as maximizers of utility, often violate that principle, doing things that are predictable and yet irrational in terms of conventional economic theory.

One of the leading lights in this field is Dan Ariely, a behavioral economist on the faculty at Duke (he’s teaching a free online course on behavioral economics at Coursera beginning March 25, 2013). His book Predictably Irrational provides a good overview of the field for a lay audience. Many themes emerge from this work. Thus, and as we’ve seen, people intensely dislike losing things that they once possessed. Also, they really, really like it when things are free. For instance, imposing even the slightest cost on a service drastically reduces people’s use of it. That’s why Amazon.com has FREE SHIPPING. It’s really not free, the cost getting factored in elsewhere, but it makes an enormous difference to how people feel about ordering from Amazon when the shipping is, ostensibly, free.

Behavioral economics is a vast subject and we’ll only touch on it here. But in touching on it, let’s focus on a place where it can be abused. People behave irrationally in predictable ways. The irrationality here is gauged in terms of people’s stated self-interest. Self-interest says that we should strive for this and avoid that. And yet, by manipulating people’s perceptions, it’s possible to get people to act against their self-interest and in the interests of the manipulator, that is, the person who understands behavioral economics and is able to exploit it against people.

Consider the use of decoys in marketing. People don’t like to buy something if it’s the only one of its kind on the market. It just doesn’t feel right. If there’s only one, how can it be any good? This happened with bread machines a few decades ago. Someone had the bright idea of making machines dedicated exclusively to making bread. But initially the idea didn’t take off — not many people were buying the machines. Until, that is, a marketer who knew something of behavioral economics (whether under that rubric or self-taught) realized that the way to sell these machines was to market a second machine. By producing two machines and making one clearly a better deal than the other, the manufacturer made little or no profit on one of the machines but recouped the loss on the other, which then sold like hotcakes.

This is how decoys work. They form a prime example of that subdiscipline of behavioral economics known as neuromarketing. Decoys allow for a comparison between the decoy and the target brand. Manufacturers lose money on the decoy, but they make it back through the extra sales that the decoy drives to the target brand. To make the target brand obviously better than the decoy, the target is made cheaper and of better quality. Who in his/her right mind would buy the decoy? But that’s not the point. The point is that by drawing a comparison that puts the target in a better light, the decoy can generate massively increased sales of the target.

We see a variation of this when the Wall Street Journal sells a hardcopy subscription for MORE than a combination of hardcopy and online access. Who in his/her right mind would not take both the hardcopy and the online access over mere hardcopy, especially when the combination is cheaper? But doing so drives sales to the item the Wall Street Journal really wants to sell, namely, the combination, which allows for users also to be hit with online advertising.
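
To make the pattern concrete, here is a toy sketch in Python. All prices are hypothetical, chosen only to mirror the Wall Street Journal pattern above; an option functions as a decoy when some other option dominates it, that is, matches or beats it on every attribute and strictly beats it on at least one.

    # Hypothetical subscription menu: print-only costs MORE than print + web
    options = {
        "print only":  {"price": 130, "print": True,  "web": False},
        "web only":    {"price": 60,  "print": False, "web": True},
        "print + web": {"price": 125, "print": True,  "web": True},
    }

    def dominates(a, b):
        """True if a is at least as good as b everywhere and better somewhere."""
        no_worse = (a["price"] <= b["price"] and
                    a["print"] >= b["print"] and
                    a["web"] >= b["web"])
        better = (a["price"] < b["price"] or
                  a["print"] > b["print"] or
                  a["web"] > b["web"])
        return no_worse and better

    for name, option in options.items():
        others = [o for o in options.values() if o is not option]
        if any(dominates(other, option) for other in others):
            print(name, "is dominated -- it works as the decoy")
    # prints: print only is dominated -- it works as the decoy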

Is there anything morally wrong with such decoys? Yes and no. Obviously there is no physical coercion here. People are free to buy what they want and don’t have to buy anything at all. And yet, there is clearly some manipulation going on here. We are being manipulated and we’re not being told that we’re being manipulated. WARNING: THIS IS A DECOY BRAND; YOU ARE PROBABLY BUYING THIS BRAND BECAUSE THE MANUFACTURER IS ALSO MARKETING AN INFERIOR BRAND AT A HIGHER PRICE. Such warnings exist on no products. Yet the truth is that some products are marketed in precisely this way.

Decoys of this sort are part of a wider move by behavioral economists to shape our environments so that we unconsciously do, not what we consciously think is in our best interest, but what others, the powers that be, regard as in our best interest. Take Richard Thaler and Cass Sunstein’s Nudge: Improving Decisions about Health, Wealth, and Happiness. In this book they describe how the environment may be shaped to give people all the options they’ve previously had, and yet in ways that radically change people’s behavior. For instance, in a cafeteria, if you want kids to eat more salad and fewer sweets, put the salads front and center and make the sweets hard to find.

Thaler and Sunstein call this approach “libertarian paternalism.” Paternalism always indicates that someone else knows what’s better for you than you do yourself. The adjective libertarian in front of paternalism indicates that the paternalism is exercised in a non-coercive rather than coercive form. But paternalism is paternalism. It suggests that we’re too stupid to know what’s best for us, or that even if we know better, we lack the self-control to do the right thing. Now that may be, but should we be manipulated into doing what’s right?

Or would it be better, whenever behavioral economists try to manipulate our actions, that we be informed that this is exactly what they are doing? Should truth in advertising include how behavioral economists are influencing our actions and decision making? In an age of over-regulation, that’s perhaps asking for too much. Perhaps a Ralph Nader of behavioral economics will arise who will call manipulative behavioral economists on the carpet as a public service whenever they try to put one over on the rest of us.

Behavioral economics points up that humans have an irrational side. But the fact that we have a rational side that can reflect on our irrational side and appreciate when we are being had suggests that behavioral economics is best played with cards on the table. People don’t like being manipulated, even for their supposed benefit. If behavioral economists want to maintain good will toward their discipline and long-term influence, it is probably in their interest to be completely up front with what they’re doing. They might therefore want to expand their “libertarian paternalism” into an “open source libertarian paternalism,” which, in line with the open source movement in computer software, hides nothing about what it is doing.

Interestingly, behavioral economics tends to work even when people know that it is being used on them. For instance, contrary to popular conception, recent research has shown that placebos can work even when people know that they are placebos (ref). So being up front about its use is probably not going to limit its effectiveness. But it can prevent behavioral economics from getting a bad name among people who would otherwise resent its intrusion without their knowledge.

Vaccines and Autism — What Might the Numbers Be Saying?

While autism rates in the United States have exploded over the last 50 years, autism’s prevalence is still small enough that in most nuclear families children are not affected by this disorder. Still, the prevalence is now sufficiently high that most people know some family affected by it. This was not the case even 20 years ago.

Is it fair to say that autism rates have in fact “exploded”? Look at the table featured at the start of this piece,[1] and you’ll see that autism rates have about doubled in the 8 years from 2000 to 2008. Moreover, they’ve gone up over 20-fold since the 1960s.[2] Here’s a brief summary of autism rates in the U.S. over the last fifty years:

  • 113 per 10,000 in 2008
  • 67 per 10,000 in 2000
  • 35 per 10,000 in the mid 1990s
  • 10 per 10,000 in the 1980s
  • 5 per 10,000 in the 1960s and 1970s [Read more…]
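
A quick check of these figures in Python (the decade-level entries are represented by single approximate years of my choosing):

    # Autism prevalence per 10,000 children, from the list above
    rates = {1965: 5, 1985: 10, 1995: 35, 2000: 67, 2008: 113}

    print(rates[2008] / rates[2000])   # ~1.69: nearly doubled from 2000 to 2008
    print(rates[2008] / rates[1965])   # ~22.6: over 20-fold since the 1960s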

What are you going to believe, me or the numbers?

In the classic Marx Brothers comedy DUCK SOUP, Groucho finds Chico with Groucho’s girlfriend in a bedroom. Chico tries to deny any hanky-panky by asking “What are you going to believe, me or your own eyes?”

Obviously, Groucho should believe his own eyes and not Chico’s bald-faced lies. And yet, in many circumstances of life, we tend to believe not our eyes but what we wish to be true based on our own predilections and pet ideologies.

Economics, for instance, is a field rife with ideology. As an academic discipline, economics can afford to consider different approaches and make different presuppositions about how best to run an economy.

But when it comes to economic policy that will impact people across the society, it’s best to knock ideology down a few notches and try to assess, in as accurate numerical terms as possible, what the practical outworkings of a policy are really going to be.

Of course, this can be easier said than done. With a policy already in place, it’s possible to measure its consequences as they unfold over time. But it’s much more difficult to forecast accurately what’s going to happen down the pike. And it’s even more difficult to assess counterfactuals, such as what would have happened had a different policy been put into effect. [Read more…]

Hurricane Sandy, Rising Oceans, and Global Warming — What Are the Numbers Really Saying?

Debates in our culture are increasingly polarized. This is especially evident in the debate over global warming. Thus, those who think man-made or anthropogenic global warming is real and destructive take one side. On the other side are the global warming skeptics, who think that humanity’s contribution to global warming is negligible and that our efforts for improving the planet are best channeled in other directions.

In all such discussions a moral element looms large. Those who think man-made global warming is real and bad think it is very bad indeed and that those who are skeptical of it have no love for the planet, making them not just mistaken in their understanding of climate science but also bad people. On the flip side, those who are skeptical of humanity’s role in global warming tend to view their opposites as fanatics and opportunists seeking some place to invest their moral energies and going on a power trip in which they become the saviors of humanity by turning back the dreaded scourge of man-made global warming.

In such sharply polarized situations, two sides typically speak past each other, with no real meeting of minds. For people who want clarity in the matter, it would therefore be helpful to chart some third position that attempts to see things dispassionately. Looking at the actual numbers connected with the global warming debate is perhaps the safest and cleanest way to do this. Thus, in place of global warming advocates and global warming skeptics, perhaps a third group can take the stage, namely, global warming analysts (there may be better ways of describing this group, but let’s go with this for now).

Instead of getting into it over the presumed negative consequences of global warming or the presumed negative consequences of taking political action to counter global warming, the global warming analyst asks the hard number questions that get to the heart of any claim about global warming that is being made. Numbers are always at the heart of this debate, and calculating them can provide clarity and insight without fanning the moral outrage that constantly infects this debate.

Take Hurricane Sandy, which in recent days has devastated the East Coast of the United States. Yesterday evening (30 Oct 2012), on MSNBC’s Hardball, Chris Matthews interviewed Princeton University professor Michael Oppenheimer, who has joint appointments in geosciences and politics (such joint appointments themselves underscore the politicization of the global warming debate). In the interview, Oppenheimer made the point that sea levels have been rising on account of global warming and that this made the devastation to New York City worse than it would otherwise have been. Chris Matthews accepted this claim by Oppenheimer and moved on. [Read more…]

The Amazing History of Information Storage: How Small Has Become Beautiful

People have been storing information since the Stone Age, ever since they’ve been writing or putting art on tablets and walls. With the invention of paper and ink, the “density of information” increased significantly, packing a lot more information into a tighter space (such as scrolls and, eventually, bound books of the kind we still use today).

The invention of printing didn’t substantially increase the density of information, though it greatly contributed to its dissemination by making information easier to copy. In the 20th century, the benchmark for a sizable chunk of information became the Encyclopedia Britannica.

The 2010 edition (the last print edition that Encyclopedia Britannica will ever publish) consists of 32 volumes and weighs 129 pounds. (Source) At 50 million words, or about 300 million characters, it requires roughly a gigabyte to store the text electronically (leaving out images and diagrams).
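
As a rough check in Python (assuming an average of about six characters per word and one byte per character):

    # Back-of-envelope size of the Britannica's text, from the figures above
    words = 50_000_000
    chars = words * 6        # ~300 million characters
    print(chars / 1e9)       # 0.3 -- a few tenths of a gigabyte, the same
                             # order of magnitude as the ~1 GB figure used below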

In an age of thumb drives that weigh less than an ounce and that, these days, routinely hold 8 or 16 gigabytes, paper and ink doesn’t seem like a very efficient way to store information. But electronic storage wasn’t always so efficient.

Have a look at the following picture, taken in 1956. What is being taken out of that old Pan Am airliner?

What you see here is the hard drive for IBM’s 305 RAMAC computer. That hard drive, weighing over 2,000 pounds, stored a whopping 5 megabytes (not gigabytes!). It would take 200 of these to store the Encyclopedia Britannica.

The 350 Disk File consisted of a stack of fifty 24-inch discs… The capacity of the entire disk file was 5 million 7-bit characters, which works out to about 4.4 megabytes in modern parlance. This is about the same capacity as the first personal computer hard drives that appeared in the early 1980s, but was an enormous capacity for 1956. IBM leased the 350 Disk File for a $35,000 annual fee. (Source)

Note that this hard drive cost $35,000 in 1956 dollars not to buy but to rent. According to the consumer price index, $35,000 back then was worth more than $295,000 in 2012 buying power.

Of course, Moore’s law, which says that computational power doubles every 18 months, guaranteed that cost would come tumbling down even as information density shot up. By 1980, a five megabyte Seagate hard drive could sit comfortably on your desktop and cost you only $5,000 (that is, $12,600 in 2012 buying power). A 1 gigabyte hard drive in 1980 still weighed 500 pounds. (Source)
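
The doubling arithmetic behind that trend is easy to sketch in Python (using the 18-month doubling period stated above):

    # Expected improvement from 1956 to 1980 at one doubling every 18 months
    years = 1980 - 1956
    doublings = years / 1.5        # 16 doublings
    print(2 ** doublings)          # 65536.0 -- over four orders of magnitude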

By 1987, your standard Macintosh computer was coming with 20 megabytes of storage. By 1990, you could, for a few hundred dollars, swap out those 20 megabytes for a then “large” 100 megabyte hard drive. And by 1996, Macs were coming with one or two gigabytes of storage.

So, by 1996, the standard hard drives of desktop computers could finally hold the full Encyclopedia Britannica.

Information on hard drives is stored on a two-dimensional surface, so it makes sense to ask how many bits or bytes are stored per square inch, centimeter, or any other unit of length.

The information density of IBM’s 305 RAMAC came to 250 bytes per square inch. Compact disks (or CDs), which have been popular since the 1980s and are also a two-dimensional storage medium, have an information density of about 1 gigabyte per square inch.

This level of density for CDs may seem a bit generous given that a standard CD is over 4 inches in diameter and contains less than a gigabyte of information. But the tracks on a CD are about three times as far apart as the width of a track (the pits that make up a track are about half a micrometer in width), so this measure of CD information density omits a lot of the unused space on the CD.

DVDs, by using smaller pits and tighter packing of tracks, have an information density of slightly more than 2 gigabytes per square inch. And Blu-ray can take information density up to 12 gigabytes per square inch.

Commercial hard drives have increased their information density well beyond this, and are now in the several hundreds of gigabytes per square inch.

Flash memory, because it’s solid state and allows the possibility of 3-D stacking, has feature sizes measured in nanometers. The logic gates that store information in flash memory are now around 20 nanometers across (recall that the pits that store the information on a CD are 0.5 micrometers, or 500 nanometers, in width).

While all this increase in density of information storage for commercially available devices is impressive, we can do much better. For instance, cellular life has developed highly dense information storage. The DNA inside any human consists of roughly 1 gigabyte of information (3 billion nucleotide base pairs, so this is the right order of magnitude).

The DNA double-helix is 2 nanometers in diameter and rises 3.4 nanometers at each winding, also adding roughly 10 nucleotides at each winding. Do the math, and one finds that DNA stores information at a density of about 10^18, or a billion billion, bytes per cubic millimeter (i.e., 1 exabyte per mm^3). Because 1 inch is 25.4 millimeters, that’s a density of roughly 10^22, or ten billion trillion, bytes per cubic inch (i.e., about 10 zettabytes per in^3).
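
Here is that math spelled out in Python, treating the helix as a simple cylinder and assuming 2 bits (a quarter of a byte) per nucleotide:

    import math

    # Geometry of the double helix, from the figures above
    radius_nm = 1.0            # 2 nm diameter
    rise_nm = 3.4              # height gained per winding
    bases_per_winding = 10     # ~10 nucleotides added per winding

    volume_nm3 = math.pi * radius_nm**2 * rise_nm   # ~10.7 nm^3 per winding
    bytes_per_winding = bases_per_winding * 2 / 8   # 2 bits per nucleotide
    density_per_nm3 = bytes_per_winding / volume_nm3

    nm3_per_mm3 = (1e6) ** 3                        # 1 mm = 10^6 nm
    print(density_per_nm3 * nm3_per_mm3)            # ~2.3e17 bytes/mm^3,
                                                    # i.e., about 10^18

    # Volume needed for 1 GB at this density: a cube under 2 micrometers on
    # a side, comfortably within the ~10 micrometers quoted below
    mm3_for_1GB = 1e9 / (density_per_nm3 * nm3_per_mm3)
    print(mm3_for_1GB ** (1 / 3) * 1e3)             # cube side in micrometers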

Commercial electronic devices therefore don’t even come close to the information storage in the DNA of biological systems. A 1 gigabyte memory component for a thumb drive made not of flash but of DNA would need a diameter of only about 10 micrometers (a human hair is between 17 and 180 micrometers in diameter; 1000 micrometers make a millimeter). The DNA from every species that has ever existed over the billions of years of evolution could fit into a teaspoon.

As impressive as the information density of DNA is, physics allows for information storage at a much greater density still. DNA is a complex molecule and even the nucleotide bases that store genetic information are reasonably complex molecules, composed of numerous atoms. DNA therefore requires numerous atoms for each byte of information it stores.

But it is also possible to store information at the atomic level. Using dipolar coupled spins for quantum memory storage, researchers have been able to store 1,024 bits of data on a single 5CB (C18H19N) molecule, which is roughly 27 bits per atom, or slightly over 3 bytes per atom. That’s a significantly higher density of information storage than found in DNA. (Source)
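
The per-atom figure is easy to verify in Python:

    # 5CB is C18H19N: divide the stored bits among the molecule's atoms
    atoms = 18 + 19 + 1       # carbon + hydrogen + nitrogen
    bits = 1024
    print(bits / atoms)       # ~26.9 bits per atom
    print(bits / atoms / 8)   # ~3.4 bytes per atom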

But scientists have done even better than this. In 1959, physicist Richard Feynman offered a $1,000 prize for anyone who could shrink a page to 1/25,000 its linear size. Since a page is two-dimensional, such a shrinkage would constitute an increase in information density by a factor of 25,000^2, over 600 million, enough to write the Encyclopedia Britannica on the head of a pin.

Not until 1985 did Feynman finally have to pay out his prize money. It went to researchers at Stanford who used “electron beam lithography to engrave the opening page of Dickens’ A Tale of Two Cities in such small print that it could be read only with an electron microscope.” (Source)

This record held until 1990, when researchers at IBM arranged 35 xenon atoms to form the IBM logo.

Finally, in 2009, researchers back at Stanford were able to improve on IBM’s miniaturization, making it 40 times smaller. Using electronic quantum holography, they were able to store

35 bits per electron to encode each letter [and] write the letters so small that the bits that comprise them are subatomic in size. So one bit per atom is no longer the limit for information density. There’s a grand new horizon below that, in the subatomic regime. Indeed, there’s even more room at the bottom than we ever imagined. (Source)

To sum up, in 1956, a 5-megabyte IBM hard drive weighed over a ton, or roughly 1,000 kilograms. In other words, that hard drive stored information at a density of 5 bytes per gram.

In 2009, Stanford researchers were able to store information at a density of 35 bits per electron, or roughly 4 bytes per electron. Since an electron weighs roughly 10^-27 grams (i.e., one part in a thousand trillion trillion of a gram), that means the ultimate information density discovered to date weighs in (literally) at about four thousand trillion trillion bytes, or 4 brontobytes (see appendix), per gram.
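
Both bytes-per-gram figures are easy to verify in Python (rounding the electron mass to 10^-27 grams, as above):

    # 1956: IBM 305 RAMAC -- 5 megabytes in roughly 1,000 kilograms
    ramac_bytes = 5e6
    ramac_grams = 1_000 * 1_000                  # 1,000 kg in grams
    print(ramac_bytes / ramac_grams)             # 5.0 bytes per gram

    # 2009: Stanford -- 35 bits, or ~4.4 bytes, per electron
    bytes_per_electron = 35 / 8
    electron_grams = 1e-27                       # rounded electron mass
    print(bytes_per_electron / electron_grams)   # ~4.4e27 bytes per gram,
                                                 # i.e., about 4 brontobytes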

That makes our commercially available terabyte and petabyte storage devices seem crude.

APPENDIX: Disk Storage Reference

· 1 Bit = Binary Digit
· 8 Bits = 1 Byte
· 1000 Bytes = 1 Kilobyte = 10^3 Bytes
· 1000 Kilobytes = 1 Megabyte = 10^6 Bytes
· 1000 Megabytes = 1 Gigabyte = 10^9 Bytes
· 1000 Gigabytes = 1 Terabyte = 10^12 Bytes
· 1000 Terabytes = 1 Petabyte = 10^15 Bytes
· 1000 Petabytes = 1 Exabyte = 10^18 Bytes
· 1000 Exabytes = 1 Zettabyte = 10^21 Bytes
· 1000 Zettabytes = 1 Yottabyte = 10^24 Bytes
· 1000 Yottabytes = 1 Brontobyte = 10^27 Bytes
· 1000 Brontobytes = 1 Geopbyte = 10^30 Bytes