Today’s topic – wisdom in health care. John Halamka has a great article in Harvard Business Review. It is an opinion piece discussing the danger of the big data craze and how patients, armed with their wearable devices, lifelogging software, and Google searches, can diagnose themselves into an early grave. They can’t translate the heaps of data into wisdom. The algorithms in their health care apps at best get them to the information stage, with some loose guesses at knowledge.
Using the cuff, I took my BP before and after commuting, drinking tea, and attending anxiety-provoking meetings — nearly 100 measurements in a week. The raw data were just numbers, although they helped reveal interesting information — that none of my life activities (commuting, tea drinking, work) influence my blood pressure. The problem, logged as a discrete data point in my electronic health record (EHR), turned out to be my parents.
Today we have a guest post from content manager and University of Florida PhD student France Jackson, continuing the discussion around HFES 2015. During UX Day at the annual meeting, I was honored to be a participant in the UX Leadership Development Workshop, and I want to share my experience. Prior to the event, participants were informed that there were three theme areas we would be discussing at the workshop and asked to prepare our thoughts and talking…
There is a new trend that is putting the human back into the picture. Smart organizations have come to realize that, as powerful as algorithms are, people are just too complicated and diverse to be modeled with math alone. They acknowledge the incredible value that human insight (the third type of thinking I teased here) can add to even the most sophisticated algorithm when it comes to understanding and predicting what a user might be interested in. If I ask an algorithm to find songs that have Jack Kerouac lyrics and Sex Pistols music, I will have to wait about 20 years for it to figure something out. But a 1970s-style DJ could do it in a heartbeat.
Users are obviously the best judge of what they want to see, but we’re also notoriously reluctant to spend time and effort curating—take a look at the Google+ Circles experiment, for example. Facebook is likely hoping these changes help strike the right balance between control and convenience.
A study published this month by a data scientist at Facebook raises some really interesting issues about ethics, big data, and the monitoring, collection, and manipulation of our behavior on social media. This topic is important for all of us because such data are being collected for all kinds of reasons: basic research, design, user modeling, ethnography, and many others. So no matter what sector you are in, this matters to you…