Digital Determinants of Health
I attended the Healthcare IT Advisory Committee meeting this past week (https://www.healthit.gov/hitac/events/health-it-advisory-committee-43). It was a fascinating day of presentations from experts across the healthcare spectrum discussing social and digital determinants of health. Part of the discussion was around the growing 'digital divide', but the issue is bigger than that. The meeting touched on these aspects numerous times, but I'd like to focus on the digital aspects as a growing risk to the health of certain populations. It's something I think about a lot, so here's the beginning of the conversation.
In the U.S., we've been talking about social determinants of health (SDOH) for decades. More recently, organizations and IT departments have been talking about digital transformation: essentially, the push to digitize whatever can be digitized in order to improve some aspect of the business, such as operations, profitability, or the patient, customer, and staff experience.
Digital transformation may be well underway, but the discussion is now rightly turning to digital determinants of health as part of the overall equation. It's been generally accepted that about 20% of a person's health is determined through interacting with the healthcare system and 80% is shaped outside of that setting. I'd posit that the ratio is shifting a bit, because digital determinants impact both healthcare and non-healthcare settings.
As the CIO of a large Federally Qualified Health Center (FQHC), I am increasingly concerned with the digital aspects of health for our patient population. I see disparities in access to affordable Internet service, access to smartphone technology, and overall digital literacy in some populations. And it's not always tied to socio-economic factors. Consider that many healthcare providers struggle to use healthcare technology, and that struggle is a major contributor to provider burnout (more on that in another post). So it doesn't always matter how affluent or educated you are; it may simply come down to one's overall comfort with technology. We all know people who are master "gizmologists" (my term) - those who can reset the Wi-Fi router, configure the settings on your phone, or troubleshoot an application problem with ease.
On the flip side of this equation, people (especially in younger generations) who are fully digital are leaving behind a robust trail of data to be consumed and analyzed by algorithms and machine learning. That will naturally extend and exacerbate the digital divide as we generate assumptions and hypotheses based on data that excludes large segments of our population. In many cases, it's not accurate to extrapolate from existing data because the populations are not at all analogous.
I'm not a data scientist, but as a healthcare IT leader, these are logical conclusions to draw.
Our challenge in healthcare IT is to look across social and digital determinants of health to ensure our communities and our populations are well-served by the work we do.
Digital equity means ensuring all members of our community have equitable access to technology and the ability (and choice) to use that technology to participate in improving their own health. That requires access, education, and support on many different levels, few of which exist today. We need to look at social service offerings and include technology among those referral and support options.
I don't know exactly what the answers look like, but with the spotlight swinging toward digital equity as a part of the overall determinants of health, I know we're at least beginning to look in the right direction.
Please join in the conversation at a local, state, national, or global level - wherever you feel you can make the biggest difference. Thanks!