Monday, April 21, 2014

Cedars-Sinai Nurses Will Now Screen All Hospitalized Adult Patients for Depression – Any False Positives, As Being in the Hospital Is Enough To Depress Anyone? Scoring or Subjective Information?

This is interesting, as I indicated in the title here; nobody wants to be in a hospital today.  As I read here, there are two key questions that will be asked of each patient to get a general feel for whether there are signs of depression.  How this is implemented and carried through is the big question too.  A couple of years ago Cedars closed its inpatient and outpatient psychiatry services, but it still seems to have a Department of Psychiatry. 

Cedars-Sinai Hospital Closing In-Patient and Out-Patient Psychiatry Services–Will Give Grants To Nearby Clinics

The approach here is to look for certain characteristics of depression when patients are admitted.  You get two questions, and if the responses fall within what you might call a “danger zone,” then you get additional questions to zero in for more information.  They want to find out, for one, whether you are suicidal and whether you are getting enough sleep.  Well, once you get admitted to a hospital, that takes care of that issue, as you won’t get a lot of sleep there:) 
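The article doesn’t name the instrument Cedars-Sinai is using, but the two-questions-then-follow-up pattern described above matches the common PHQ-2-style screen. Here’s a minimal sketch of that flow; the question wording, the 0–3 answer scale, and the cutoff of 3 are assumptions based on that common pattern, not anything confirmed in the press release.

```python
# Hypothetical PHQ-2-style screen: two initial questions, and a total
# score at or above a cutoff triggers the longer follow-up questionnaire
# (the "danger zone" mentioned above). Questions and cutoff are assumed.

PHQ2_QUESTIONS = [
    "Little interest or pleasure in doing things?",
    "Feeling down, depressed, or hopeless?",
]

def phq2_screen(answers):
    """answers: two integers 0-3 (0 = not at all .. 3 = nearly every day).
    Returns (total_score, follow_up_needed)."""
    if len(answers) != 2 or any(a not in range(4) for a in answers):
        raise ValueError("expected two answers, each 0-3")
    score = sum(answers)
    # A total of 3 or more is the commonly cited cutoff that prompts
    # the additional zero-in questions.
    return score, score >= 3

print(phq2_screen([0, 1]))  # below cutoff, no follow-up
print(phq2_screen([2, 3]))  # above cutoff, follow-up questions asked
```

Note this is exactly the kind of thresholding the post questions: two numbers in, one flag out, with everything the patient actually said left behind.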

My question here too is whether this becomes part of the patient’s medical record with full documentation, where the screening is “scored,” or whether the nurse leaves the entire subjective response in the patient file so a doctor would have more than a “score” to look at.  This is an opinion sort of thing too, and if they are going to screen and do this on an official basis, then why not do a second opinion while you are at it? 
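One way the documentation question above could cut both ways: a chart entry can carry the numeric score and the nurse’s subjective note side by side, so a physician sees more than a bare number. A rough sketch, with field names that are purely hypothetical and not any real EHR schema:

```python
# Hypothetical chart entry keeping both the "score" and the subjective
# narrative the post argues shouldn't be lost. Not a real EHR structure.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class DepressionScreenEntry:
    score: int        # numeric screening total
    nurse_note: str   # free-text subjective observations
    screened_at: datetime = field(default_factory=datetime.now)

    def summary(self):
        # What a physician might see at a glance: number plus context.
        return f"score={self.score}; note={self.nurse_note}"

entry = DepressionScreenEntry(4, "Patient tearful, reports poor sleep since admission.")
print(entry.summary())
```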

Common sense and just doing their jobs usually has nurses and doctors on alert anyway to look for symptoms, and I assume this is now a standardized function here, with questions that have been approved by the board for use.  If this is an enhancement effort to get staff to recognize signs of mental health issues early on, then that’s a good thing.  However, if this is more data to add to a medical record that contains a lot of potentially subjective material for a “scoring” purpose, then the whole idea is lost and we are just collecting data which may or may not be accurate. 

If you look at patient activation scoring, there’s a good example of where subjective information can get left out, as you get a score of one, two, three or four based on your knowledge of your health.  As something like this works into a care system, with 4 being little knowledge, enter the efficiency folks, and then it graduates to doctors’ time and to how many 4s or 3s a doctor can see while being productive, as these folks have been scored to indicate the doctor will need more “talk” and “education” time with them.  Are you going to hear the doctor say, “Don’t schedule me more than one or two 3s or 4s today, as those will require too much extra time with my tight schedule”?…

I made this point a while back, as a very old NIH study was used to back this type of scoring, and then you have the double whammy of how well the doctor scores the patients that comes next.  Sometimes people take studies and try to wrap software and scoring systems around whatever context they take from the study, and sometimes you do get some really strange perceptions of how to use study findings.

Patient activation Scores - Quantitate This Too With a Score? Guess We Have Forgotten How To Be Human, Are We Instead Creating More Risk Assessments In the Pursuit To Reach the Top of the Heap Of The Profitable “Junk Science” Department?

So if this is done with good common sense and not yet another “score” created, then it’s a good thing to watch for signs of mental illness, as clinical professionals should be doing that anyway.  But when you make items like this a numerical data function to put into a patient’s chart, well, there’s lots of room for errors, and again this is not a perfect science either.  Recently the World Privacy Forum put out a report, as referenced at this blog post, that focuses on the US and how citizens here are over-scored right and left without anyone questioning the proprietary mathematical and algorithmic processes by which this takes place, so proprietary algorithms in healthcare and in other areas of life silently become law.  The models should be open so others can replicate and validate their use and value.  BD


Newswise — LOS ANGELES (April 21, 2014) – In an effort to identify and treat patients with undiagnosed depression, Cedars-Sinai nurses are screening each hospitalized patient for signs of the illness and for risk factors that could make recoveries harder and longer.

The new initiative is believed to be one of the broadest depression screenings of patients in a U.S. medical center.

Although many illnesses are associated with some feelings of anxiety, stress and fear, the new screening process is designed specifically to help detect symptoms of clinical depression characterized by a severely disheartened mood, lowered activity level and persistent negative thoughts lasting longer than two weeks. More than 18 million Americans – about 7 percent of the adult population – experience major depression each year.

http://www.newswise.com/articles/view/616715/?sc=rsla&utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+NewswiseLatestNews+%28Newswise%3A+Latest+News%29
