In our American culture, plastic surgery is supposed to make people look younger and prettier, and it’s supposed to make them feel more confident and socially acceptable. To me, all those feelings should come from within. A lot of people say I am genetically lucky, so I shouldn’t mouth off about how lame plastic surgery is, but I’m no Barbie doll, okay? There are plenty of things I don’t find perfect about myself physically. I’ve just accepted them, because they’re really not that important.
To me, the worst thing about plastic surgery becoming mainstream is that, the more socially accepted it becomes, the weirder people are starting to look. And people just assume you’ve had it if you have a small nose or big knockers. I’ve never had anything done, and I don’t intend to. One thing I’ve certainly learned over the years is to never say never, but I just can’t imagine going under the knife to look prettier. Is it just me, or does anyone else out there think that people who don’t feel good about themselves should try working on their insides first?