I'm just sayin'. Sometimes it irritates me. I've been working with kids in group homes and in all the various permutations of foster care for almost ten years now. I can't claim that I've been madly in love with the job for every second of that time, and I don't plan on doing it forever. There are a couple of asshats where I work, as I assume there are everywhere.
By and large, though, I've been surrounded by some of the most caring and patient people I've ever come across in my life. They suffer all kinds of bullshit and poor treatment on a daily basis that very, very few people would put up with. All while getting paid shit.
In the movies, though, and on TV... the social worker is always the 'bad guy.' Oftentimes condescending, rude, and embarrassingly brief in their visits and interviews, social workers are portrayed as being out to get the parents and as apparently terminally stupid: e.g., a child has a single bruise on his or her shin, and the social worker, after talking to the family, decides to call in an abuse report.
Where's the consideration of the child's daily activities? How about looking for bilateral bruising, or multiples (new bruises on top of old ones)? How about taking the bruise's placement into consideration? Bruises are common on protuberances and bony areas of the body; we tend to worry more when they're on the 'softer' bits. We are, actually, trained on what to look for and how to consider the whole picture in that kind of situation. I have never known a single social worker who got as excited about reporting or removing a kid as they seem to in the movies.
Ok. End of mini-rant. Anyone else out there get crapped on by society's view of what their job is?