


NIST may soon put the venerable photoresist etching lithography process to rest with some old-fashioned press power.

Lithography is now a ubiquitous term when it comes to integrated circuits. As lithography techniques have improved, ICs have become more densely packed with the all-important transistor. The latest processor cores from popular manufacturers AMD and Intel pack hundreds of millions of transistors into a few square centimeters.

Lithography allows chip manufacturers to etch patterns into silicon and insulating materials, where nanoscale transistors and copper wiring do the bulk of an IC's work. But both the physical limits of current lithography techniques and the insulating properties of silicon are being stressed as the insulating channels grow smaller and the transistors are squeezed ever closer together.

Work with high-k metal gates has allowed further miniaturization, but these materials will soon be at their limits. Carbon nanotubes may replace copper interconnects, but that won't boost the insulating properties of silicon or improve lithography further.

The National Institute of Standards and Technology has been working on a new form of lithography, called nanoimprint lithography (NIL). Rather than etching patterns into a material like most methods, NIL is, as the name implies, an embossing process. A die containing the necessary patterns is created and used to stamp the insulating material.

The actual material is important as well, as it must be malleable enough to accept the nanoscale imprint, but rigid enough to hold the shapes. Most NIL films are physically hardened after impression, using heat or ultraviolet radiation.

NIST has been using spin-on organosilicate glass (SOG) as the insulating film in its work. SOG starts as a fluid film, which is then hardened to glass using heat. SOG is also a superior insulator because it is laced with nanoscale pores. One of the reasons SOG is not presently used in ICs is that photoresist etching can damage the material, compromising its properties as an insulator. NIL, on the other hand, leaves the material unaffected.

Not only can NIST's NIL process successfully stamp SOG; the stamping itself actually makes the glass film a better insulator. Normally SOG is laced with both large and small nanopores. While the small pores increase insulation, the larger ones can interfere with it. The NIL process helps eliminate the large pores in favor of the beneficial smaller ones. It also creates a dense protective skin on the surface of the film, further shielding it from external interference.
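To see roughly why porosity matters, the effective dielectric constant (k) of a porous film can be approximated with a simple volume-fraction mixing rule. This is only a back-of-the-envelope sketch, not NIST's model, and the k value for dense organosilicate glass below is an assumed illustrative figure:

```python
# Illustrative sketch: estimate the effective dielectric constant of a
# porous glass film with a linear volume-fraction mixing rule.
# A lower k means less capacitive coupling between adjacent wires,
# i.e. a better insulator for IC purposes.

def effective_k(k_matrix: float, porosity: float, k_pore: float = 1.0) -> float:
    """Linear mixing rule: pores (k ~ 1 for air/vacuum) lower the
    effective dielectric constant of the solid matrix."""
    if not 0.0 <= porosity <= 1.0:
        raise ValueError("porosity must be between 0 and 1")
    return (1.0 - porosity) * k_matrix + porosity * k_pore

# Assumed k ~ 3.0 for the dense glass; 25% porosity lowers it noticeably.
print(effective_k(3.0, 0.25))  # 2.5
```

The same rule shows why preserving porosity matters: any processing step that collapses the pores pushes the effective k back toward that of the dense matrix.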

NIST's work looks promising and may give chip manufacturers better products with fewer process steps and better materials than present manufacturing techniques allow. Combined with new interconnect materials and extremely small transistors, Moore's law might yet survive another two decades.
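For a sense of what "another two decades" of Moore's law would mean, the canonical doubling cadence compounds quickly. The starting transistor count and two-year doubling period below are assumptions for illustration, not a forecast:

```python
# Back-of-the-envelope Moore's law projection: transistor counts
# doubling roughly every two years.

def moores_law(start_count: float, years: float, doubling_period: float = 2.0) -> float:
    """Project a transistor count forward assuming steady doubling."""
    return start_count * 2 ** (years / doubling_period)

# A 2008-era CPU with an assumed ~400 million transistors, 20 years out:
# 10 doublings is a 1024x increase, i.e. hundreds of billions of transistors.
print(f"{moores_law(400e6, 20):.3e}")
```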



Comments

Extremely Beneficial
By sphyder on 5/1/2008 12:06:35 PM , Rating: 2
This technology would be a larger leap forward than most here are giving it credit for. Photo lithography is used in a hell of a lot more than cpu manufacturing. Look at a motherboard and you will see a wide range of chips that use photo/etch in their production. Open up any hard drive and you will find read/write heads that use the same process. The benefits go way beyond just computers. Think of just about any consumer electronics (set top boxes, cell phones, televisions, etc). I for one am hoping for something like this to become a reality. Imagine your cell phone having the power of your current desktop. This breakthrough is just one more step in that direction.




insulating material
By maverick85wd on 5/1/2008 4:15:34 PM , Rating: 2
quote:
NIST has been using spin-on organosilicate glass (SOG) as the insulating film in their work. SOG starts as a fluid film, which is then hardened to glass using heat. SOG is also a superior insulator due to being laced with nanoscale pores. One of the reasons SOG is not presently used in ICs is that photoresist etching can damage the material, compromising its properties as an insulator. NIL on the other hand leaves the material unaffected.


I wonder how this SOG material compares to graphene in terms of performance and potential transistor size.

http://www.engadget.com/2007/12/20/princeton-resea...




Senior PC = features
By lemonadesoda on 5/2/2008 5:31:05 AM , Rating: 1
1. One button mouse... to stop the inevitable wrong mouse button problem

2. Simple cable management: one power cable, one cable from the PC to the keyboard, one cable from the keyboard to the mouse

3. DVD and TFT built in

4. Simple, comprehensive software suite built in

5. Email account and client, a piece of cake to register and set up

Hmmm, there's something like that on the market already! LOL




Computing
By unbaisedgamer on 5/1/08, Rating: -1
RE: Computing
By Macuser89 on 5/1/2008 9:46:40 AM , Rating: 1
I am reminded of Bill Gates when you say a current piece of hardware will be good for 20 years.

Bill Gates:
"640K ought to be enough for anybody"
Not sure if that's a real quote, but I found it on the internet.

I am sure we will find a way to use that CPU speed in the future, or we will not make the CPU.


RE: Computing
By Bioniccrackmonk on 5/1/2008 10:29:35 AM , Rating: 2
quote:
I am sure we will find a way to use that CPU speed in the future, or we will not make the CPU.


Crysis 2, Alan Wake, etc.


RE: Computing
By FITCamaro on 5/1/2008 10:42:47 AM , Rating: 2
The vast majority of people do not play PC games. While PC gaming will continue to drive the innovation of new chips, one has to wonder how long that will last. While there will always be those who like to play games on PC, how long before developers view it as not worth the effort? PC game piracy is much more rampant than console piracy.

Game consoles will never be able to touch the capabilities of a PC, but they do seem to be where gaming is heading. I can't blame them. While yes, a keyboard and mouse are more accurate, it's a lot easier and more fun to be lying back on the couch playing with a controller than sitting in front of your computer. Yes, you can plug your computer into your HDTV, but then you have issues of what to put the mouse on and the keyboard sliding around on your lap.


RE: Computing
By MRwizard on 5/9/2008 2:12:30 AM , Rating: 2
well, you can always use a compatible mouse and keyboard with Sony and MS. anyone that complains solely about that should use Google a bit more


RE: Computing
By MRwizard on 5/9/2008 2:24:40 AM , Rating: 2
darn, forgot the part about the sliding. what I do is lie down; the underside of my keyboard has rubber, the same stuff as under the mouse pad, and I don't have problems with it sliding around


RE: Computing
By spluurfg on 5/1/2008 9:54:42 AM , Rating: 2
quote:
I'd prefer these companies invest more into revolutionizing computing rather than squeezing out more evolutionary technology.


Unfortunately, for every successful tech revolution there are a number of failures. Big players like Nokia and Intel have successful and well-established venture capital businesses investing in precisely the sort of 'revolutionary' ideas you are thinking of. Such investments are very risky, and typically VC firms will write off a significant amount of their invested capital, relying on a few very successful home-run investments to make up for it.

So they do invest in ideas that aim to revolutionize computing -- they just don't invest all their money doing so to ensure they don't go bankrupt.


RE: Computing
By kattanna on 5/1/2008 10:07:31 AM , Rating: 2
quote:
For normal business applications, a dual core CPU clocked at 2ghz per core will be more than adequate for the next 20 years at least


agreed. Heck, we have PIII systems here running just fine for what they do. I can see businesses bringing in newer dual core machines, but once they do.. honestly.. you can only type so fast in Word or read your email so fast in Outlook.

home computer users who might watch a movie, check email, surf, or play a few mildly resource-hungry games have also been well covered since oh.. 2004.

only hardcore gamers really need anything new to keep up with their hobby. for everyone else anything "new" in 2006/2007 is more than enough for them.


RE: Computing
By Cogman on 5/1/2008 10:32:05 AM , Rating: 2
Ummm, Crysis, Supreme Commander, and Fear are a few games that come to mind that weren't well covered by 2004 machines (heck Crysis isn't covered very well by today's machines). I wouldn't call someone a "Hard Core" gamer that wants to run these games, just an average consumer.

As for watching movies, 2004 computers are slowly becoming less capable of handling current media standards (H.264). That will be one of the big reasons for increasing processing power. (For whatever reason, watching a movie on a laptop is insanely popular where I live)

Now for business applications, yeah, I completely agree. Companies that spend large amounts of money on employee PCs are for the most part throwing away their money. The only change I would make from a 2000 computer is getting them an LCD rather than a CRT (save their eyes and some power). Other than that, 500 MHz is fast enough to barely limit productivity and do just about anything the employee needs to do (type a Word document).

Heck, a system built on a mainframe concept would be plenty for most businesses. A quad core processor could easily serve hundreds of employees writing in Word.


RE: Computing
By FITCamaro on 5/1/2008 10:47:51 AM , Rating: 3
quote:
As for watching movies, slowly 2004 computers are becoming less capable to handle current media standards (H.264)


As long as you have a PCI-E slot, a $40 video card can take care of all HD processing. You don't need a top end CPU for it any more.

And the average consumer doesn't play PC games. His statement that a computer from 2004 is adequate for most is correct. The X2 based system I built my parents last fall is probably the last computer I'll build them for a while unless they want another one. I'll pop another 2GB of RAM in it eventually but it will be able to take care of anything they'll want to do for a long time.


RE: Computing
By stilltrying on 5/1/2008 1:13:03 PM , Rating: 2
Try telling that to an Electronic Medical Records system with high level graphics. Add on virus scanning memory hog software. More and more bloated officeware, photoshop, etc... Thanks but no thanks to your PIII adequacies in my work environment.


RE: Computing
By kyleb2112 on 5/2/2008 2:51:48 AM , Rating: 2
Uh, no. There are legions of content providers who celebrate every processor upgrade. That's why when they review and benchmark CPUs, they have whole sections for 3D and video compression performance. Processing power directly affects not only my timelines, but what projects I can accept. By comparison, games aren't even multithreaded yet.


RE: Computing
By Connoisseur on 5/1/2008 10:28:36 AM , Rating: 2
Dude, you're kidding right? People have been making statements like that for years. I remember my cousin saying something like: "Oh yeah, you won't need anything more than a 60MHz Pentium processor for office apps." This was about 10-12 years ago. My point is that you cannot predict what sort of applications/functionality one might want or need a few years down the road. Just because you're used to applications as they are now doesn't mean they can't evolve into something much more radical that might become a "necessity". All this work requires more processing power both on a personal level and a research or corporate level.

I for one am glad some companies are working on improving existing technologies (transistors) while others are focusing on revolutionary technologies (quantum computing) at the same time. It's the best of both worlds that ensures that progress isn't hindered just because we're waiting for the next big thing.


RE: Computing
By soydios on 5/1/2008 10:46:27 AM , Rating: 2
You're right, the current CPU power is sufficient for simple 2-Dimensional text-based computing.

But as soon as text-to-speech, speech-to-text, and high-definition video become ubiquitous, we'll need more processing power.


RE: Computing
By Reclaimer77 on 5/1/2008 11:15:24 AM , Rating: 2
quote:
For normal business applications, a dual core CPU clocked at 2ghz per core will be more than adequate for the next 20 years at least.


I'm trying to think of a way to be polite about this, but I just can't. I think statements like this are pure ignorance of the highest level.

If we were to jump 20 years from now we probably wouldn't even recognize a then "personal computer" from ours today. Room-temperature superconducting technology will be commonplace by then. PCs will probably be half the size of a Shuttle with 500 times more processing power than today. Heatsinks and fans will be obsolete. System RAM as we know it probably won't even exist anymore, because we won't need it. "Monitors" will probably be as thick as a few sheets of paper. And mouse and keyboard? HA! Something MUCH more intuitive will replace that, I'm sure.

The world isn't going to just stop because you can run your office apps on obsolete technology. Sorry.


RE: Computing
By djc208 on 5/1/2008 11:26:01 AM , Rating: 2
That's why you're not seeing huge increases in processor power, you're seeing increases in processors. While the core does improve and get more efficient the big push lately has been into broader coverage (more cores, x86 in new markets, etc.).

This technology can be used to make the next uber-processor, which someone will always want for graphics design, CAD/CAM, modeling, servers, etc., but it can also allow an Atom-class processor to go into an iPhone-like device.

Besides, electronics is truly a trickle-down setup. This technology will allow the older lithography processes to move onto other chips to help shrink all the other electronics you use.


"I'm an Internet expert too. It's all right to wire the industrial zone only, but there are many problems if other regions of the North are wired." -- North Korean Supreme Commander Kim Jong-il













