
Eamon is looking to create a new method for calculating structural failure probability

A researcher from Wayne State University in Detroit, Michigan recently received a three-year, $250,000 grant to create a new method for calculating structural failure probability.

The grant was awarded to Christopher Eamon, an associate professor of civil and environmental engineering at Wayne State University. The National Science Foundation provided the $250,000 to support the development of new reliability-analysis methods for probabilistically and computationally complex structural engineering problems, with the goal of achieving more consistent safety levels across structures.

Current methods can provide accurate results via simulation, but they are costly and time-consuming for complex problems, which can require an enormous number of simulation runs per analysis. A Monte Carlo analysis, for instance, can take over one million runs; Eamon said such analyses can take hours or days to complete. For complex problems that also require uncertainty analysis, repeating the analysis several times becomes impractical.

Eamon is looking to change this by creating a method that is as accurate as Monte Carlo simulation but requires only about 1,000 computations. With a cheaper, faster, yet equally accurate technique, engineers will be able to determine the safety levels of structures while avoiding the inconsistencies that arise today.
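For readers unfamiliar with the technique the article refers to, a plain Monte Carlo failure-probability estimate works by drawing random loads and resistances and counting how often the structure's limit-state function goes negative. A minimal sketch in Python (the Gaussian resistance/load numbers and function names here are invented for illustration, not taken from Eamon's work):

```python
import random

def failure_probability_mc(limit_state, sample, n=1_000_000, seed=0):
    """Estimate P(failure) as the fraction of random samples for which
    the limit-state function is negative (load exceeds resistance)."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n) if limit_state(sample(rng)) < 0)
    return failures / n

# Toy problem: resistance R ~ N(10, 1) against load S ~ N(6, 1).
def sample(rng):
    return rng.gauss(10, 1), rng.gauss(6, 1)

def limit_state(x):
    r, s = x
    return r - s  # negative means the load exceeded the resistance

p_f = failure_probability_mc(limit_state, sample, n=200_000)
print(f"Estimated failure probability: {p_f:.4f}")
```

Because the estimate is just a counting exercise, its accuracy depends entirely on how many samples land in the (rare) failure region, which is why run counts balloon for safe structures.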

"If you don't get the safety factors right, you can get very inconsistent results in terms of safety level from one structure to the next because of different levels of uncertainty, different loads, components and so on," said Eamon. "If you're expending limited resources, it makes no sense to have one structure 10 times as safe as another if they're the same level of importance. We're trying to get the level of safety to be more evenly distributed and more consistent."

Creating this new method could potentially help the areas of civil engineering, medicine and computing as well as mechanical and electrical areas.

Sources: HPCwire, Eurekalert

Comments

Computing Power
By toyotabedzrock on 1/11/2012 5:42:22 PM , Rating: 2
Don't we have enough computing power on hand to do these calculations much quicker?

RE: Computing Power
By lightfoot on 1/11/2012 6:19:36 PM , Rating: 3
If a new simulation can get a better (more accurate) solution in only 1,000 iterations than the current simulation can achieve in a million iterations, you should switch to the new simulation regardless of computing power.

This is about the quality of the model, not the amount of processing required. Saving processing power is only a side benefit.
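A back-of-the-envelope calculation supports the million-run figure in the article: the coefficient of variation of a plain Monte Carlo estimator is roughly sqrt((1-p)/(n*p)), so rare failure probabilities demand huge sample counts. A quick sketch (the 10% accuracy target and the 1e-4 failure probability are illustrative assumptions, not values from the article):

```python
import math

def mc_samples_needed(p_f, cov=0.10):
    """Samples needed so that a plain Monte Carlo estimator's
    coefficient of variation (std/mean) falls to `cov`, given a
    true failure probability p_f."""
    return math.ceil((1 - p_f) / (cov**2 * p_f))

# For a rare failure (p_f = 1e-4), ~10% accuracy takes roughly a million runs:
print(mc_samples_needed(1e-4))
```

This is why a method that gets comparable accuracy in ~1,000 evaluations matters even when raw computing power is cheap.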

RE: Computing Power
By Smartless on 1/11/2012 6:44:09 PM , Rating: 3
Lol I wonder how real-world they'll get?

1) Inconsistent integrity because contractors were lazy or trying to skimp.
2) Sub-par materials from sub-par companies.
3) Weather at time of construction. Cold, hot, humidity...
4) Unseen environmental factors? Tacoma-Narrows Bridge anyone?

Most failures I've seen were due to one or all of these. Of course, most projects are slightly over-designed but I guess that's the point of this simulation software.

RE: Computing Power
By EricMartello on 1/11/2012 8:07:34 PM , Rating: 1
The simulation is to assess the integrity of the design itself in a best-case scenario. For example, consider a building such as a bow-shaped skyscraper that has a glass tube as an express elevator from the ground floor to top floor where the bow string would be. It's an eccentric piece of architecture, and they could assess whether or not we have the materials to make it viable for construction.

The variables you listed are indeed possible causes for failure, but they would fall under negligence. I don't think they'll ever have a computer simulation robust enough to quantify human stupidity.

RE: Computing Power
By bebimbap on 1/11/2012 6:44:40 PM , Rating: 3
"Don't we have enough computing power on hand to do these calculations much quicker?"

That kind of thinking can be very dangerous. When I was getting my computer science degree, some students would argue that robust, elegant code wasn't necessary: hardware would improve so much that the end user wouldn't perceive the difference when running brute-force code. That train of thought applies not only to electronics but to anything in life.

There is a difference between convenience and efficiency.

RE: Computing Power
By fic2 on 1/11/2012 7:16:19 PM , Rating: 2
Ha! I have had professional software people tell me the same thing. Granted I didn't think they were good at writing software, but that is what they were paid to do. I actually had to beg a team lead to let me change the dumb@ss interface that they "designed" to get data from my software. Then had to stand over another guy on his team and tell him which lines of code to delete or change. Shrunk their code to about 25% of what it was and made the data interface flexible.

RE: Computing Power
By DanNeely on 1/11/2012 7:02:34 PM , Rating: 2
If doing a simulation takes weeks on a rented supercomputer, you'll probably only do one at the end of the design phase and maybe one halfway through, and pray everything works as you expected.

If it can be done overnight on a cluster of a few godboxes, you can employ continuous-integration-style techniques to track how performance changes with your design, iteratively reinforcing the weaker areas and trimming ones that are severely over-engineered. If a major regression does occur, you only have a small set of changes to examine to ID the root cause.

If you can do it in a few seconds to minutes on a standard engineer's workstation, the impact of each individual change can be tracked in real time, and you have the potential to use genetic algorithms to optimize the design for you.
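The genetic-algorithm idea mentioned above only works when each fitness evaluation is cheap. A minimal evolutionary loop can be sketched as follows (the toy beam-depth problem, the penalty constant, and all function names are my own illustration, assuming a fast per-design cost evaluation):

```python
import random

def evolve(fitness, init, mutate, pop_size=20, generations=50, seed=0):
    """Minimal evolutionary loop: keep the best half of the population
    each generation and refill it with mutated copies of survivors."""
    rng = random.Random(seed)
    pop = [init(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                      # lowest cost first
        survivors = pop[: pop_size // 2]
        pop = survivors + [mutate(rng.choice(survivors), rng)
                           for _ in range(pop_size - len(survivors))]
    return min(pop, key=fitness)

# Toy design problem: minimize beam depth d (material cost), but any
# design shallower than d = 20 is "unsafe" and gets a huge penalty.
cost = lambda d: d + (1e6 if d < 20 else 0)
best = evolve(cost,
              init=lambda rng: rng.uniform(1, 50),
              mutate=lambda d, rng: max(1.0, d + rng.gauss(0, 1)))
print(f"Best depth found: {best:.2f}")
```

The search converges toward the constraint boundary, which is exactly the "trim the over-engineered parts" workflow the comment describes, and each generation here costs only a few dozen fitness calls.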

RE: Computing Power
By kraeper on 1/11/2012 8:16:44 PM , Rating: 2
I'm sure a new model/code could do that, but a less radical port of the existing code to run on a GPU instead of a CPU would probably pay off faster.

RE: Computing Power
By mmatis on 1/16/2012 8:21:07 AM , Rating: 2
You apparently haven't seen what many IT organizations consider to be a "standard engineer's workstation"...


Copyright 2015 DailyTech LLC.