
Intel and Nanochip team up to develop 100GB-per-chip storage

This article was syndicated from Tiago Marques' blog at

Nanochip, Inc., a Silicon Valley startup, has raised $14 million in funding from Intel Capital, Intel's global investment organization, for further development of its MEMS storage technology.

You read it right: gigabytes, not gigabits.

According to Nanochip (PDF), the technology isn't lithography constrained, allowing production of chips of more than 1GB in capacity, in plants that have already been deemed outdated by current standards.

The lack of lithography constraints means cheaper products, creating an opportunity to replace flash memory as well, since the technology is likewise non-volatile.

Today's factories should be able to produce the first products, estimated at 100GB per chip. The technology is expected to reach the consumer market by 2010, with the first samples available during 2009.

PRAM, or phase change memory, was expected to be the technology to replace flash in the coming years, since it is also non-volatile while being much faster than flash. As Intel found out over the last few years, however, PRAM doesn't seem to scale so well in regards to density, and still has some hurdles to overcome -- namely its thermal principles of operation.

Nanochip's details of the technology are ambiguous at best, though what is known is that the company is working on a hybrid micro-electro-mechanical (MEMS) design -- and the partnership with Intel suggests a PRAM connection too.  The company has described a very small platter coated in chalcogenide that is dragged beneath hundreds of thousands of electrical probes, which can read and write the chalcogenide.  Casual estimates put this sort of density at one terabit per square inch, or 125GB per square inch.
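The arithmetic behind that density figure is straightforward; a quick sketch, assuming decimal units (1GB = 10^9 bytes), as the article uses:

```python
# Converting the quoted areal density: 1 terabit per square inch.
# Assumes decimal prefixes (1 GB = 10^9 bytes), as in the article.
terabit = 10**12                       # bits per square inch
bytes_per_sq_inch = terabit / 8        # 8 bits per byte
gb_per_sq_inch = bytes_per_sq_inch / 10**9
print(gb_per_sq_inch)                  # 125.0
```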

The company has not disclosed access speeds. That is an area where PRAM is expected to be the undisputed king of the hill, so it could limit the applications of this type of technology.

For now it seems that flash SSDs are going to be replaced before they even reach mass consumption -- which is a good thing. The technology is expensive, doesn't provide a lot of storage space, and is prone to failure due to the low number of write cycles available per cell. Flash is perfect for pendrives and resisting shock, not so good for regular, intensive, HDD-style usage.


RE: Which is it?
By the goat on 1/30/2008 8:31:35 AM , Rating: 2
From Merriam-Webster dictionary:

Etymology: International Scientific Vocabulary, from Greek gigas giant: billion (10^9) <gigahertz> <gigawatt>

Notice words number two through four there? International Scientific Vocabulary. In other words, your opinion counts for nothing. And if you still want to argue, take it up with the National Institute of Standards and Technology. Here, I'll even give you the link to the .pdf:

The definitions from Merriam-Webster and from NIST and anywhere else you can find are all taken from the SI definition. So it doesn't add any more weight to your argument.
You done now? I am Greek. I know the word. The word itself (γίγας, actually pronounced "gigas" (soft 'g')) does mean giant, yes, but gigabytes and gigahertz and gigawatts are not giantbytes and gianthertz and giantwatts. They're billions. Same as a terabyte and a terahertz and a terawatt are not monsterbytes and monsterhertz and monsterwatts. Those are trillions. Stop confusing standard convention in the scientific community with the etymology of ancient Greek words.

Nobody was ever saying bytes and bits were SI units. We were saying the prefixes used to describe them in quantity (including kilo, mega, giga and tera) are SI prefixes and are attached to specific quantities.

If you are Greek why did you say the word giga was Latin?

You seem to have missed my point. SI does not own any of the prefixes they use. SI took/stole their prefixes from other languages. SI is not dictated to us by the God-Emperor on Arrakis. So guess what: other people besides SI are allowed to invent units of measure and define prefixes to use with those non-SI units. The definitions of kilobyte = 1024 bytes, megabyte = 1024^2, gigabyte = 1024^3, etc. have been in the popular lexicon for close to 40 years. That has quite a bit of weight. It is hard to now say, "wait a second, you are using the wrong definition because I never gave you permission to define your units." Why should anybody ask permission from SI or anybody else before using words and units of measure that have been defined for several decades?

Let me point out a direct analogue to this argument. This is an example of one system of measure taking a word defined by another system of measure and redefining the word for their own use. Does the fact that the word now has many unequal definitions make one or the other definition more or less valid? The example I am talking about is the word "ton". The word ton means different things in different systems of measurement. In the USA, ton = 2000lbs. in the customary system of measure. In the UK, ton = 2240lbs. in the imperial system of measure. In the SI system of measure, ton = 1000kg = ~2205lbs. Which one is correct? Did SI illegally steal the word ton from the imperial system? Of course all are equally correct. The same as 1 kilogram = 1000 grams and 1 kilobyte = 1024 bytes. SI borrowed the word ton just like the computer memory industry borrowed prefixes from Greek and Latin. Nobody owns the word "ton". Nobody owns the prefix kilo, mega, tera, etc.
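The gap between the two readings of these prefixes is easy to quantify. A small sketch, assuming nothing beyond the decimal (SI) and binary definitions discussed above:

```python
# Decimal (SI) vs. binary interpretation of each prefix.
# The relative gap grows with every step up the scale.
for n, name in enumerate(["kilo", "mega", "giga", "tera"], start=1):
    decimal = 1000 ** n
    binary = 1024 ** n
    print(f"{name}: {binary} vs {decimal}  (ratio {binary / decimal:.4f})")

# At "giga" the ratio is ~1.0737, which is why a drive sold as
# 100 GB (decimal) shows up as roughly 93 GiB (binary) in an OS.
print(round(100 * 10**9 / 1024**3, 1))
```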

RE: Which is it?
By HaZaRd2K6 on 1/30/2008 10:10:23 PM , Rating: 2
First of all the word Giga is Latin not Greek...
If you are Greek why did you say the word giga was Latin?

If you go back and read this back-and-forth dialogue, you'll actually discover you said giga was Latin, not me.

And I refuse to keep this going any longer. My point is that using SI prefixes for values that are not SI standards is where confusion arises. The drive manufacturers use SI prefixes as exactly what they are, but programmers define standard SI prefixes somewhat differently. Whether or not they actually are SI prefixes is beside the point. Most people take the prefix "giga" to mean "billion". It's really that simple.

RE: Which is it?
By mindless1 on 1/31/2008 11:47:24 PM , Rating: 2
It's real simple: It makes no difference at all if it's Greek, Latin, or even if the term was "dogfood" instead of "giga". Literally, if the industry wanted to use the term dogfoodbytes instead, once it was a standard it doesn't matter that elsewhere dogfood comes in a bag and canines eat it.

What matters is that the entire industry standardized on a term; it is irrelevant whether that term meant something else in another discipline before, during, or afterwards. The computer industry has clearly established the value of these terms, and anyone who tries to look smart by declaring a standard term invalid because some third party says so decades after it was standardized is fooling themselves.



Copyright 2014 DailyTech LLC.