I was talking to a friend recently about how I'm in my mid-20s and still feel like a kid. She made a comment that really struck me: you're always a kid until you're responsible for one. In essence, she explained that people don't know what it truly means to be an adult until they've had kids. Relationships, careers, financial responsibilities, etc. are all just child's play compared to the immensity of creating and raising a child.
Parents, do you agree with that idea? Did having kids change who you were? Did it make you feel less like a kid yourself and more like an adult? Do you think people who don't have kids are not truly adults in some sense?