is our culture obsessed with money/prestige? I need some alternative viewpoints.

I've been at two universities now (one for undergrad and one for grad), and it seems like a lot of students I come across want to be doctors, lawyers, or investment bankers. It's like there are no other career choices. Like, if someone comes into college wanting to be a doctor, but decides they don't like it, they'll apply to law school instead. I know a number of people who, when they graduated, sent out apps to both med schools AND law schools...talk about not knowing what you want to do in life!

While I believe that some of them truly do love the profession they are pursuing, I can't help but think that most of them are just motivated by money and prestige. Sometimes it really frustrates me that people can't look past the $$$ and decide, hey, maybe there's something else in life for me.

Have things always been like this, or have they gotten worse in the past few years?