


Intel and Nanochip team up to develop 100-gigabyte-per-chip memory

This article was syndicated from Tiago Marques' blog at SiliconMadness.com

Nanochip, Inc., a Silicon Valley startup, has managed to raise $14 million in funding from Intel Capital, Intel's global investment organization, to further develop its MEMS-based storage technology.

You read it right: gigabytes, not gigabits.

According to Nanochip (PDF), the technology isn't lithography-constrained, allowing chips of more than 1GB in capacity to be produced in plants that have already been deemed outdated by current standards.

The lack of lithography constraints means cheaper products and, since the technology is also non-volatile, an opportunity to replace flash memory as well.

Today's factories should be able to produce the first products, estimated at 100GB per chip, when the technology reaches the consumer market, expected by 2010. The first samples should be available during 2009.

PRAM, or phase-change memory, was expected to be the technology to replace flash in the coming years, since it is also non-volatile and much faster than flash. As Intel found out over the last few years, however, PRAM doesn't seem to scale so well in terms of density, and it still has hurdles to overcome -- namely its thermal principle of operation.

Nanochip's details of the technology are ambiguous at best, though what is known is that the company is working on a hybrid micro-electro-mechanical (MEMS) design -- and the partnership with Intel suggests a PRAM connection too. The company has described it as a very small platter coated in chalcogenide that is dragged beneath hundreds of thousands of electrical probes, which read and write the chalcogenide. Casual estimates put this sort of density at one terabit per square inch, or 125GB per square inch.
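For readers who want to sanity-check that conversion, here is a minimal Python sketch; the one-terabit-per-square-inch figure is just the casual estimate quoted above, not a confirmed specification:

```python
# Convert the quoted areal-density estimate from terabits to gigabytes.
# Assumes decimal SI units (1 terabit = 10**12 bits, 1 GB = 10**9 bytes).
bits_per_sq_inch = 1 * 10**12           # casual estimate: 1 terabit per square inch
bytes_per_sq_inch = bits_per_sq_inch / 8
gb_per_sq_inch = bytes_per_sq_inch / 10**9
print(f"{gb_per_sq_inch:.0f} GB per square inch")   # -> 125
```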

The company has not disclosed access speeds. That is an area where PRAM is expected to be the undisputed king of the hill, so it could limit the applications of this type of technology.

For now it seems that flash SSDs may be replaced before they even reach mass consumption -- which is a good thing. The technology is expensive, doesn't provide a lot of storage space and is prone to failure due to the low number of write cycles available per cell. Flash is perfect for pendrives and for resisting shock, not so good for regular, intensive, HDD-style usage.


Comments



RE: Which is it?
By Assimilator87 on 1/28/2008 1:51:08 PM , Rating: 3
I'm pretty sure Seagate recently had a class action lawsuit filed against them for misrepresenting the actual size of the HDDs. Only a few GB? The consumers care.


RE: Which is it?
By onwisconsin on 1/28/2008 2:15:07 PM , Rating: 2
Or lawyers...


RE: Which is it?
By omnicronx on 1/28/2008 2:54:03 PM , Rating: 2
quote:
I'm pretty sure Seagate recently had a class action lawsuit filed against them for misrepresenting the actual size of the HDDs. Only a few GB? The consumers care.

What would we ever do without our extra 48,576 bytes per MB ;)
HD manufacturers purposely did this when HD sizes were small so as not to confuse consumers (it made the math easy: 1MB = 1,000 kilobytes).

Although it seems like manufacturers are trying to squeeze out every extra penny, the truth is probably that the people who first thought up this plan did not take into account how far HD capacities would actually scale. The bigger the hard drive gets, the more times you lose those extra 48,576 bytes ;)

I never really understood how losing somewhere around 50GB per 1TB warranted a class action suit...
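For what it's worth, the arithmetic behind that gap is easy to check; here's a minimal Python sketch (purely illustrative -- the exact shortfall depends on the labeled capacity):

```python
# How the decimal/binary gap grows with drive size.
MB, MiB = 10**6, 2**20        # 1,000,000 vs 1,048,576 bytes
GiB, TB = 2**30, 10**12

print(MiB - MB)               # 48576 -> the ~48,576 "extra" bytes per MB mentioned above

labeled_bytes = 1 * TB        # a drive sold as 1 TB (decimal)
reported = labeled_bytes / GiB
print(f"OS reports roughly {reported:.1f} GB")           # ~931.3
print(f"Apparent shortfall: {1000 - reported:.0f} GB")   # per labeled terabyte
```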


RE: Which is it?
By HaZaRd2K6 on 1/28/2008 3:37:19 PM , Rating: 5
The reason that whole situation existed at all is the different ways drive manufacturers and software programmers measure a gigabyte. A gigabyte, technically speaking, is exactly one billion (10^9) bytes. This is what drive manufacturers put on their drives. What software programmers call one gigabyte is actually one gibibyte (giga binary byte, or GiB, 2^30 bytes).

As an example, take a hard drive that can store exactly 250×10^9, or 250 billion, bytes after formatting. Operating systems generally calculate disk and file sizes using binary units, so this 250GB drive would be reported as "232.83 GB". The result is a significant discrepancy between what consumers believe they have purchased and what their operating system says they have. So when Seagate says you have a 500GB drive, they're right: you have five hundred billion bytes of storage on that drive. Just because Microsoft measures those five hundred billion bytes in binary units does not mean there is any difference in the actual, physical storage capacity of the drive.
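That 232.83 figure is straightforward to reproduce; a quick sketch in the same spirit (illustrative only):

```python
# Reproduce the example above: a drive holding exactly 250 * 10**9 bytes,
# as reported by an operating system that divides by 2**30.
bytes_on_drive = 250 * 10**9
reported_gb = bytes_on_drive / 2**30
print(f"{reported_gb:.2f} GB")    # -> 232.83
```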


RE: Which is it?
By mindless1 on 1/28/2008 6:43:44 PM , Rating: 1
Wrong. It's not just software programmers, it's everyone else too: hardware manufacturers' ratings of bus speeds, chip capacities (including memory). It was also hard drive manufacturers! Did you get that last sentence? Hard drive manufacturers did correctly state capacity early on, then changed to the present capacity mislabeling.

A gigabyte, technically speaking, is not 10^9 bytes. Understand what a prefix is: it is there to describe the suffix, which is a binary, not a decimal, value.

The fact is, the entire computer industry and scientists do accept the binary number system. Only ignorant people who can't understand that there is more than one number system think that a prefix can only exist in the decimal system or has to have a decimal value, that it couldn't be a binary value.

The entire computer industry defined what gigabit, gigabyte, megabit, etc, meant and it was a standardized fixed term BEFORE certain companies misused the term trying to make it a decimal value. That's the whole point of a standardized term, that it doesn't change even if the hard drive industry or a confused poster like yourself really really want it to change.


RE: Which is it?
By chekk on 1/28/2008 8:07:36 PM , Rating: 2
Wow, did get up on the wrong side of your cage this morning? There's no need to be nasty.
quote:
Understand what a prefix is, it is to describe the suffix which is a binary not decimal value.

Exactly. Which means that whether one is referring to decimal values, binary values, apples or whelks, the giga prefix means 10^9.


RE: Which is it?
By chekk on 1/28/2008 8:10:44 PM , Rating: 2
Crap. Apparently, I can't proofread.
"... did you get up ..."


RE: Which is it?
By the goat on 1/29/2008 8:57:20 AM , Rating: 2
quote:
Exactly. Which means that whether one is referring to decimal values, binary values, apples or whelks, the giga prefix means 10^9.


You sir are 100% wrong. These prefixes are based on ancient Greek and Latin. Nobody owns these Greek and Latin prefixes. The "International System" (SI for short) of measurement (meters, liters, etc.) uses these prefixes to represent powers of ten (i.e. giga = 10^9). This is true for every SI unit of measure.

But it is very important to realize that bits, nibbles and bytes are not SI units. Therefore SI has no authority to dictate how ancient Greek and Latin words used as prefixes modify these computer memory units of measure.

Everybody in the computer memory industry knows how to use prefixes on binary units correctly. There is no dispute as to what a kilobyte, etc. is defined as (hint: kilobyte = 2^10 = 1024 bytes, not more, not less).


RE: Which is it?
By grv on 1/29/2008 9:19:09 AM , Rating: 2
just because a couple of uneducated programmers misused decimal prefixes doesn't mean it's somehow right or "standard".
learn, binary prefixes have been standardized since 1998: http://physics.nist.gov/cuu/Units/binary.html
and don't forget to ask for your money back for IT education. stupid teachers deserve no money


RE: Which is it?
By the goat on 1/29/2008 2:51:25 PM , Rating: 2
quote:
just because a couple of uneducated programmers misused decimal prefixes doesn't mean it's somehow right or "standard". learn, binary prefixes have been standardized since 1998: http://physics.nist.gov/cuu/Units/binary.html
and don't forget to ask for your money back for IT education. stupid teachers deserve no money

The units for measuring computer memory were defined well before 1998. The alternative binary-only prefixes you are talking about are a joke. Nobody uses them.

Like I said before, the prefixes in question are based on ancient Greek and Latin. They are not misused by the computer memory industry. They are not tied to decimal units only.

SI gets to make up prefixes for all the units of measure they invent. But SI didn't invent bytes and bits. So why does anybody think SI should dictate how prefixes work with bits and bytes?


RE: Which is it?
By mindless1 on 1/31/2008 11:42:17 PM , Rating: 2
I think you mean the entire computer industry, not just a couple of programmers. Obviously you have no argument if you choose to ignore the basic fact that the computer industry did standardize the terms gigabyte, etc., to mean a value in the binary system. They DEFINED what giga, mega, etc., meant in the binary system. 10^9, etc., is in the decimal system, not the binary system.

I'd have to agree with the goat, in that it's laughable you reference 1998 -- what about 30 years earlier?

The problem is simple, you don't know what standardization is or why it's important.


RE: Which is it?
By HaZaRd2K6 on 1/29/2008 2:50:59 PM , Rating: 2
Now that's where you're wrong.

The Système International prefixes are based on ancient Greek. Specifically, the prefix "giga" refers to a billion (ten to the power of nine, or 1,000,000,000). The Système International works in base ten, yes, hence giga is base ten.

Computer programmers and component manufacturers turned this definition upside down, made it two to the power of thirty and called it a gigabyte. Two to the power of thirty, as I mentioned before in this thread, is 1,073,741,824 -- not one billion. In base two (binary), gibibyte is the correct term for 2^30 bytes. Gigabyte refers to exactly one billion bytes -- no more, no less. Gibibyte refers to exactly 1,073,741,824 bytes.

So stop it. You're wrong. Gigabyte means one billion bytes. Gibibyte means 1,073,741,824 bytes. Now you can either tell programmers to start coding in base ten, tell drive manufacturers to start listing drive sizes in base two, or just put up with it and realise that, no matter what happens, neither side is going to give up.


RE: Which is it?
By the goat on 1/29/2008 3:02:30 PM , Rating: 2
quote:
The Système International prefixes are based on ancient Greek. Specifically, the prefix "giga" refers to billion (or ten to the power of nine or 1,000,000,000).

Incorrect in so many ways. First of all the word Giga is Latin not Greek and it means giant not 1,000,000,000.

Mega = Great (Greek)
Tera = Monster (Greek)

What do the words giant, great and monster have to do with base ten?

quote:
The Système International works in base ten, yes, hence giga is base ten.

SI is base ten, no doubt. But bytes and bits are not SI units. So why should I care about SI prefixes?


RE: Which is it?
By HaZaRd2K6 on 1/29/2008 11:09:38 PM , Rating: 2
From Merriam-Webster dictionary:

Giga
Etymology: International Scientific Vocabulary, from Greek gigas giant: billion (10^9) <gigahertz> <gigawatt>


Notice words number two through four there? International Scientific Vocabulary. In other words, your opinion counts for nothing. And if you still want to argue, take it up with the National Institute of Standards and Technology. Here, I'll even give you the link to the .pdf: http://physics.nist.gov/cuu/pdf/sp811.pdf

You done now? I am Greek. I know the word. The word itself (γίγας, actually pronounced "gigas" (soft 'g')) does mean giant, yes, but gigabytes and gigahertz and gigawatts are not giantbytes and gianthertz and giantwatts. They're billions. Same as a terabyte and a terahertz and a terawatt are not monsterbytes and monsterhertz and monsterwatts. Those are trillions. Stop confusing standard convention in the scientific community with the etymology of ancient Greek words.

quote:
SI is base ten, no doubt. But bytes and bits are not SI units. So why should I care about SI prefixes?

Nobody was ever saying bytes and bits were SI units. We were saying the prefixes used to describe them in quantity (including kilo, mega, giga and tera) are SI prefixes and are attached to specific quantities.


RE: Which is it?
By the goat on 1/30/2008 8:31:35 AM , Rating: 2
quote:
From Merriam-Webster dictionary:

Giga
Etymology: International Scientific Vocabulary, from Greek gigas giant: billion (10^9) <gigahertz> <gigawatt>

Notice words number two through four there? International Scientific Vocabulary. In other words, your opinion counts for nothing. And if you still want to argue, take it up with the National Institute of Standards and Technology. Here, I'll even give you the link to the .pdf: http://physics.nist.gov/cuu/pdf/sp811.pdf

The definitions from Merriam-Webster and from NIST and anywhere else you can find are all taken from the SI definition. So it doesn't add any more weight to your argument.
quote:
You done now? I am Greek. I know the word. The word itself (γίγας, actually pronounced "gigas" (soft 'g')) does mean giant, yes, but gigabytes and gigahertz and gigawatts are not giantbytes and gianthertz and giantwatts. They're billions. Same as a terabyte and a terahertz and a terawatt are not monsterbytes and monsterhertz and monsterwatts. Those are trillions. Stop confusing standard convention in the scientific community with the etymology of ancient Greek words.

Nobody was ever saying bytes and bits were SI units. We were saying the prefixes used to describe them in quantity (including kilo, mega, giga and tera) are SI prefixes and are attached to specific quantities.

If you are Greek, why did you say the word giga was Latin?

You seem to have missed my point. SI does not own any of the prefixes it uses. SI took/stole its prefixes from other languages. SI is not dictated to us by the God-Emperor on Arrakis. So guess what: other people besides SI are allowed to invent units of measure and define prefixes to use with those non-SI units. The definitions of kilobyte = 1024 bytes, megabyte = 1024^2 bytes, gigabyte = 1024^3 bytes, etc. have been in the popular lexicon for close to 40 years. That has quite a bit of weight. It is hard to now say, "wait a second, you are using the wrong definition because I never gave you permission to define your units." Why should anybody ask permission from SI or anybody else before using words and units of measure that have been defined for several decades?

Let me point out a direct analogue to this argument. This is an example of one system of measure taking a word defined by another system of measure and redefining it for its own use. Does the fact that the word now has several unequal definitions make one or another definition more or less valid? The example I am talking about is the word "ton". The word ton means different things in different systems of measurement. In the USA, a ton = 2,000 lbs in the imperial system of measure. In the UK, a ton = 2,240 lbs in the imperial system of measure. In the SI system of measure, a ton = 1,000 kg = ~2,205 lbs. Which one is correct? Did SI illegally steal the word ton from the imperial system? Of course all are equally correct. The same as 1 kilogram = 1000 grams and 1 kilobyte = 1024 bytes. SI borrowed the word ton just like the computer memory industry borrowed prefixes from Greek and Latin. Nobody owns the word "ton". Nobody owns the prefixes kilo, mega, tera, etc.


RE: Which is it?
By HaZaRd2K6 on 1/30/2008 10:10:23 PM , Rating: 2
quote:
First of all the word Giga is Latin not Greek...
quote:
If you are Greek why did you say the word giga was Latin?

If you go back and read this back-and-forth dialogue, you'll actually discover you said giga was Latin, not me.

And I refuse to keep this going any longer. My point is that using SI prefixes for values that are not SI standards is where the confusion arises. The drive manufacturers use SI prefixes for exactly what they are, but programmers define the standard SI prefixes somewhat differently. Whether or not they actually are SI prefixes is beside the point. Most people take the prefix "giga" to mean "billion". It's really that simple.


RE: Which is it?
By mindless1 on 1/31/2008 11:47:24 PM , Rating: 2
It's real simple: it makes no difference at all if it's Greek, Latin, or even if the term were "dogfood" instead of "giga". Literally, if the industry had wanted to use the term dogfoodbytes instead, once it became a standard it wouldn't matter that elsewhere dogfood comes in a bag and canines eat it.

What matters is that the entire industry standardized a term; it is irrelevant whether that term means something else in another discipline before, during, or afterwards. The computer industry clearly established the value of these terms, and anyone who tries to look smart by declaring a standard term invalid because some third party says so decades after it was standardized is fooling themselves.


RE: Which is it?
By MrPoletski on 1/29/2008 2:24:37 AM , Rating: 2
It comes down to the question:

Is one gigabyte 2^30 bytes or is it 10^9 bytes?

I, personally, prefer the 2^30 idea.

But, most people don't even know what a base-2 number system is. In fact, there are 2 types of people, those who do and those who don't.

When they see the capacity counted in base-2 gigabytes (BINARY, for those who don't) the number looks smaller, because 2^30 > 10^9.

I think HDD manufacturers should list their capacities in gigabits but use a base-8 (OCTAL, for those 7 of you who don't) number system so people actually bother to learn about these simple things.


RE: Which is it?
By mallums on 1/29/2008 2:47:24 AM , Rating: 2
I think you meant 10 types of people. :)


RE: Which is it?
By HaZaRd2K6 on 1/30/2008 10:12:51 PM , Rating: 2
quote:
I think you meant 10 types of people. :)

I agree with ya there ;-)

And I think gigabyte should be 10^9. Seeing as giga is an SI prefix (as much as the goat tries to say otherwise), it would only make sense. Calling it a gibibyte might sound a little weird, but technically it's correct.


"This week I got an iPhone. This weekend I got four chargers so I can keep it charged everywhere I go and a land line so I can actually make phone calls." -- Facebook CEO Mark Zuckerberg
