


YouTube user siniXster believes the NASA camera footage shows a UFO, but scientists say it's just a ghost of where Mercury was positioned the previous day

A video making its way around the Web has many wondering an age-old question: Are we alone in the universe?

The Heliospheric Imager-1 (HI-1), a camera system aboard NASA's STEREO spacecraft, managed to capture some questionable footage last week of what some are calling a cloaked spaceship orbiting Mercury.

The footage caught a coronal mass ejection, in which electrically charged material catapulted from the sun and struck Mercury. The wave of solar material seemed to hit another large object of about the same size nearby, described as cylindrical on both sides with a shape in the center, resembling a spaceship.

YouTube user siniXster posted the footage online, saying that the object in question was clearly a cloaked spaceship and may be indicative of alien life.

"What object cloaks itself and doesn't appear until it gets hit by energy from the sun?" said siniXster, who also mentioned that there could be no other explanation for the Mercury-sized object.

However, scientists at the United States Naval Research Laboratory (NRL) disagree. According to NRL head scientist Russ Howard and lead ground systems engineer Nathan Rich, the mystery UFO is actually Mercury itself. It is simply a ghost of where Mercury was positioned the previous day, visible because of the way raw HI-1 telescope data is processed.

Howard and Rich explained that NRL scientists typically remove background light when processing such data in order to make the glow of a coronal mass ejection stand out against the bright glare of space. They identify background light by calculating the average amount of light that entered each camera pixel on the day of the event and on the day before. Light found in a pixel on both days is treated as background and eliminated from the footage, while the remaining light is enhanced. This works well for distant, effectively stationary sources like stars, but it is more challenging for closer bodies that move from day to day, such as planets.

"When [this averaging process] is done between the previous day and the current day and there is a feature like a planet, this introduces dark (negative) artifacts in the background where the planet was on the previous day, when then show up as bright areas in the enhanced image," said Rich.

When the footage is reprocessed using a different day's data for the background average, the bright spot disappears, because Mercury's previous-day position no longer falls within the subtracted pixels.
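
For illustration, here is a minimal NumPy sketch of that kind of two-day background subtraction. The frame sizes, positions, and brightness values are made up for the example; this is not NRL's actual pipeline.

    import numpy as np

    # Hypothetical stand-ins for two days of raw imager frames.
    today = np.zeros((256, 256))
    yesterday = np.zeros((256, 256))
    yesterday[100, 80] = 1000.0   # bright planet at yesterday's position
    today[100, 90] = 1000.0       # the planet has moved by today

    # Background model: average the two days. The planet leaks into
    # the background at BOTH positions, at half strength.
    background = (today + yesterday) / 2.0

    # Subtracting the background removes steady light such as stars,
    # but leaves a positive residual at today's planet position and a
    # NEGATIVE artifact where the planet sat yesterday.
    enhanced = today - background

    print(enhanced[100, 90])   # +500.0: the planet, dimmed but present
    print(enhanced[100, 80])   # -500.0: the "ghost" of yesterday

Once the enhanced frame is stretched for display, that negative residual can end up rendered as a bright patch, which is exactly the sort of artifact Rich describes.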

It looks like believers will have to wait another day for E.T. to phone home, but take a look at the footage for yourself and see what you think.

Sources: Life's Little Mysteries, Yahoo News



Comments



RE: Really!?
By Solandri on 12/8/2011 5:34:43 PM, Rating: 3
This is a tried and true practice in space imaging. The same technique has been used to verify exoplanets (planets orbiting other stars) and stellar accretion disks. Normally there's so much light coming from the star that it obliterates anything as dim as a planet. So you build up an average image of the star, and subtract it from subsequent photos. This mathematically removes the star's light, and out pops the planets or disk.
http://blogs.discovermagazine.com/badastronomy/201...
http://blogs.discovermagazine.com/badastronomy/201...
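
As a rough sketch of the star-subtraction idea (hypothetical arrays and values, not any real observatory's code), you can median-combine many exposures to build a model dominated by the star's glare and subtract it:

    import numpy as np

    # Hypothetical stack of 50 exposures of the same star (frames x H x W).
    frames = np.random.rand(50, 128, 128)

    # Median-combining the stack builds a reference image dominated by
    # the star's steady light; anything faint or moving averages out.
    star_model = np.median(frames, axis=0)

    # Subtracting the model from one science frame cancels most of the
    # star's glare, leaving faint residuals such as a planet or disk.
    residual = frames[0] - star_model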

I remember they even had to do this with images from the Voyager spacecraft. The cameras wouldn't produce a uniform black. Some "pixels" (Voyager used scanning TV cameras so didn't have true pixels) were "hotter" than others, so they would consistently show up as a shade of grey instead of black. So they pored over hundreds of pictures of black space (so they wouldn't mistake a hot star for a hot pixel), and built up a huge map of the hot pixels. Then in all subsequent pictures, the values of these hot pixels were normalized to make their brightness consistent with the normal pixels.
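
In modern terms, that hot-pixel map might be built and applied roughly like this (a sketch with invented data and thresholds, not the actual Voyager processing):

    import numpy as np
    from scipy.ndimage import median_filter

    # Hypothetical stack of 200 frames of empty space (frames x H x W).
    dark_skies = np.random.rand(200, 128, 128)
    dark_skies[:, 64, 64] += 5.0  # simulate one consistently hot pixel

    # Averaging many frames washes out stars (which move between
    # pointings) while a hot pixel stays bright in the same spot.
    mean_frame = dark_skies.mean(axis=0)
    hot = mean_frame > mean_frame.mean() + 5 * mean_frame.std()

    # Normalize hot pixels in any later image by replacing them with
    # the median of their 3x3 neighborhood.
    def fix_hot_pixels(img, hot_mask):
        return np.where(hot_mask, median_filter(img, size=3), img)

    clean = fix_hot_pixels(dark_skies[0], hot)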

Digital SLRs can do the same thing via a setting called long exposure noise reduction (mostly used by astrophotographers). With this setting the camera takes two pictures. First it shoots the picture you want. Then it immediately takes a second picture with the same exposure, but this time with the shutter closed. That creates a "black frame" where the only things in the image are the hot pixels. The camera then subtracts the second picture from the first, greatly reducing the noise in the image.
http://www.kenrockwell.com/nikon/d200/d200-dark.ht...
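
The in-camera arithmetic boils down to a subtraction like the one below (a simplified sketch; real raw files are integer-valued and each vendor's exact algorithm is proprietary):

    import numpy as np

    def long_exposure_nr(light_frame, dark_frame):
        # Subtract the shutter-closed "black frame" from the exposure,
        # cancelling hot pixels and thermal noise present in both.
        # Clip at zero so noisy pixels cannot go negative.
        diff = light_frame.astype(np.int32) - dark_frame.astype(np.int32)
        return np.clip(diff, 0, None).astype(np.uint16)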


RE: Really!?
By JediJeb on 12/9/2011 11:44:37 AM, Rating: 2
That "black frame" is also sometimes called a "dark frame" in astroimaging software. Anyone interested can read up on the Meade or Celestron or SBIG websites for introductory info on CCD astroimaging.


"If you can find a PS3 anywhere in North America that's been on shelves for more than five minutes, I'll give you 1,200 bucks for it." -- SCEA President Jack Tretton














botimage
Copyright 2015 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki