
How teachers can transfer math anxiety to some of their students

A new study published in the Proceedings of the National Academy of Sciences finds that female teachers who are anxious about math can pass that attitude on to their female students.

The study followed 17 first- and second-grade elementary school teachers, along with 52 boys and 65 girls.  At the beginning of the school year, the researchers found no correlation between a teacher's math anxiety and her students' math achievement.

By the end of the year, boys were still unaffected, but girls who had come to believe that boys are naturally better at math scored lower.

"We are not sure whether it's something overt, whether it's non-verbal behavior or perhaps (teachers are) not spending much time on the subject," said Susan Levine, a University of Chicago professor of psychology and human development and co-author of the study, "Female Teachers' Math Anxiety Affects Girls' Math Achievement."  "It's not just a teacher's knowledge of the subject, but there's something about their feeling about the discipline."

This is a significant problem: scientists, engineers and mathematicians are in high demand, and these jobs help stimulate the economy.  However, men continue to dominate engineering and IT jobs, and researchers believe it is detrimental to research to "dismiss 50%" of potential researchers because they are women.

The National Survey of Science and Mathematics Education indicates that more than 90% of all elementary school teachers in the U.S. are women -- and only a minimal amount of mathematics study is required to receive a teaching certificate, something that may be addressed in the future.

Comments

By TheEinstein on 1/28/2010 11:14:29 AM , Rating: 1
Here, enjoy my invention in binary math, see if you can follow it

Step 1:
Make 3 temp files
if x = 00 then 0 in each file
if x = 01 then 0 in files one and two, and 1 in file three
if x = 10 then 0 in file one, 1 in file two, nothing in file three
if x = 11 then 1 in file one, nothing in files two and three

Then use a binary compression algorithm on any of the temp files that will compress... or repeat the system if gains can be had.

This will bring maximum entropy to bear on ANY given file, with the least possible code to do so, and will not increase the overhead of any file by using a simple check vs. size. Each temp file has a chance of statistical compression. This is the closest humanity will get to a 'random data compressor'.
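The three-file split described in the steps above can be sketched in Python (a minimal reading of the post; treating the input as a string of '0'/'1' characters is an assumption of this sketch):

```python
def split(bits):
    """Consume the input two bits at a time and fan it out into
    three streams, following the rules in the post above."""
    f1, f2, f3 = [], [], []
    for i in range(0, len(bits), 2):
        x = bits[i:i + 2]
        if x == "00":
            f1.append("0"); f2.append("0"); f3.append("0")
        elif x == "01":
            f1.append("0"); f2.append("0"); f3.append("1")
        elif x == "10":
            f1.append("0"); f2.append("1")   # nothing in file three
        else:  # "11"
            f1.append("1")                   # nothing in files two and three
    return "".join(f1), "".join(f2), "".join(f3)

# All four possible pairs, once each: 8 input bits become 9 output bits.
print(split("00011011"))  # -> ('0001', '001', '01')
```

Note that this example input already shows the expansion the next reply computes in general: 4 + 3 + 2 = 9 bits out for 8 bits in.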

Now when you can do theoretical binary math, or follow large statistics thoroughly, you can naysay someone saying what they say. Until then, how about you just say 'well, I won't disbelieve unless the person gives me a basis to do so'.

By Donovan on 1/28/2010 5:04:17 PM , Rating: 2
For a random input your invention takes, on average and before any compression, 9 bits to hold 8 bits of data:

(2 bits) 00 -> 0,0,0 (3 bits)
(2 bits) 01 -> 0,0,1 (3 bits)
(2 bits) 10 -> 0,1,- (2 bits)
(2 bits) 11 -> 1,-,- (1 bit)

The first file has a probability of 3/4 for 0 and 1/4 for 1. The average number of bits required to encode one bit from that file is (using Log for log base 2 and omitting parentheses around the fractions):

E_1 = - 3/4 Log( 3/4 ) - 1/4 Log( 1/4 ) = 1/4 Log( (4/3)^3 * 4 ) = 1/4 Log( 256/27 )

Similarly the second file has a probability of 2/3 for 0 and 1/3 for 1, giving:

E_2 = - 2/3 Log( 2/3 ) - 1/3 Log( 1/3 ) = 1/3 Log( (3/2)^2 * 3 ) = 1/3 Log( 27/4 )

The third file has a probability of 1/2 for both 0 and 1, so it will not compress for random input:

E_3 = - 1/2 Log( 1/2 ) - 1/2 Log( 1/2 ) = - Log( 1/2 ) = Log( 2 ) = 1

For every 8 bits of input we have, on average, 4 bits in the first file, 3 bits in the second file, and 2 bits in the third file. After compression, the number of bits your invention will end up taking for those 8 bits of input is:

4 * E_1 + 3 * E_2 + 2 * E_3 = 4 * 1/4 * Log( 256/27 ) + 3 * 1/3 * Log( 27/4 ) + 2 * 1 = Log( 256/27 * 27/4 ) + 2 = Log( 64 ) + 2 = 6 + 2 = 8

In other words, your algorithm does nothing.
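The arithmetic above is easy to verify numerically; a minimal Python check using the binary entropy function H(p) for the three per-file bit probabilities derived above:

```python
from math import log2

def H(p):
    """Binary entropy in bits: the average cost of encoding one bit
    that is 1 with probability p."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

# Per-file probabilities of a 1, assuming uniform random 2-bit inputs:
# file one sees a 1 only for 11 (1/4); file two sees a 1 only for 10,
# out of the three pairs that write to it (1/3); file three is a coin flip.
E1, E2, E3 = H(1 / 4), H(1 / 3), H(1 / 2)

# Per 8 input bits: 4 bits land in file one, 3 in file two, 2 in file three.
raw = 4 + 3 + 2                        # 9 bits before any compression
compressed = 4 * E1 + 3 * E2 + 2 * E3  # entropy lower bound afterward
print(raw)                             # -> 9
print(compressed)                      # numerically 8, matching the input size
```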

will not increase the over-head of any file by using a simple check vs size
It always costs you at least one bit to store the fact that you didn't compress at all...otherwise your decompressor will apply the decompression algorithm to the uncompressed data and return garbage. If you actually work out the details for your invention you will either stumble upon that bit or end up inadvertently hiding it somewhere in the file system metadata (such as the existence of multiple files).

Google for "Pigeonhole principle" for further details on why a lossless compressor must always increase the size of some inputs.
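The counting argument behind the pigeonhole principle fits in three lines: there are 2^n bit strings of length n, but only 2^n - 1 bit strings strictly shorter than n, so no lossless compressor can map every length-n input to a shorter output.

```python
# Count length-n inputs vs. all strictly shorter possible outputs.
n = 16
inputs = 2 ** n
shorter_outputs = sum(2 ** k for k in range(n))  # lengths 0 .. n-1
print(inputs, shorter_outputs)  # -> 65536 65535
```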

Each temp file has a chance of statistical compression
...which is just enough to cancel out the bloat you added. One difference: the bloat is absolute while the compression is only theoretical, so using a real compressor will leave you with a net loss.

This is why your teacher wanted you to show all your work.

By TheEinstein on 1/28/2010 8:36:45 PM , Rating: 2
A) Pigeonhole of course always trumps; you're stupid to think otherwise.

B) However, many files can also be compressed... lower than their original size. You're stupid to think otherwise.

C) Your math is bad. You assume only random data. I do not. If the file is uncompressible, it is UNCOMPRESSIBLE.

The system is designed to be used upon skewed data sets. Re-run your math now to try to account for text, software languages, and the likes.

You tried, I will give you credit, however you completely failed to apply basic concepts in your attempt to act high and mighty.

As for showing how you got your work, see "A Beautiful Mind." He showed how he did some math through his life, but other math he would just do, and never explain how he got the answer.

There are math geniuses who do not ever need to 'do the steps' to get the answer. They can adequately walk through complex math principles merely telling the answer. 10 internet points if you can name one of the conditions that allows this.

By incarnaterage on 1/29/2010 1:55:25 AM , Rating: 2
You said in your previous post that this will bring maximum entropy to "ANY give file" (sic).

Except Donovan just.... proved otherwise. Poor Fella.

Furthermore, yes, it's great that Fermat jotted random theorems in his notes and then noted "I can prove this... but I need to feed my cat". Well, pat on the back for him, good job. Except that none of that work serves as a firm basis for other scientific work, because of the lack of rigorous proof.

The main point, however, was never that you didn't know math. The point of your middle school math class, or any other topical class, wasn't to teach you the pitiful content matter at the time, but rather to teach you to follow direction and instruction.

But /thread.

If you want to go through life ignoring what people in charge tell you to do, I wish you the best of luck. You should go point a gun at a cop and then argue with him when he tells you to put it down.

By TheEinstein on 1/30/2010 1:49:00 AM , Rating: 2
On true random data, 8 to 8 is the best entropy you can get for the given data. That is indeed the lowest entropy of the file.

When data gets a little 'lopsided' then you can get better than 8 to 8 ratios.

So no, he did not prove anything.
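The narrower claim here — that skewed data compresses while random data does not — is easy to check empirically with Python's standard-library zlib (the specific inputs below are illustrative):

```python
import os
import zlib

# Highly repetitive (skewed) data vs. uniformly random bytes.
skewed = b"abab" * 4096          # 16 KiB of repeating text
random_ = os.urandom(16384)      # 16 KiB of random bytes

# zlib shrinks the skewed input dramatically...
print(len(zlib.compress(skewed)) < len(skewed))    # -> True
# ...but random input comes back at least as large (header overhead).
print(len(zlib.compress(random_)) >= len(random_))  # -> True
```

This does not rescue the three-file split, though: the split itself adds the same fixed overhead regardless of how skewed the input is, and a real compressor could have been applied to the original file directly.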
