A new study supposedly suggests that men are less "receptive" to "messages" that we ought to eat more fruits and vegetables. It also invoked something called "the theory of planned behavior," which (if I understand it correctly) says that people construct beliefs to match their behavior. The additional idea is that because men don't get their fruits and vegetables, they develop the belief that it's not really much of a problem.
By contrast, women — who get their fruits and veggies — end up thinking this food will make them good-looking and long-lived. So if men did eat more fruits and vegetables, they'd arrive at the appropriate beliefs. But how do you get men to do that unless they already believe it's good?
See how they're trying to reverse things? Do it, and then you'll believe it's good. We don't want to have to convince you that you should do it so that you'll do it. We just want you to do it, and the belief that it's good will follow, pursuant to the theory of planned behavior.
But wait. Is it good? When was it ever proved that eating fruits and vegetables is important? Maybe men don't believe it because it's been little more than a folk belief all this time. Why assume the women are the norm and the men are misbehaving? Maybe men demand evidence and don't simply follow the dictates of experts.
I got this link from Instapundit, who just says: "FEAR OF E. COLI? Why men don’t eat vegetables." Maybe there's an instinctive resistance, at least to raw foods. What I'm resisting is the ever-irritating bullshit science that presents whatever is true of women as what's good.