Categories
Mobile, Presentation, Usability

#iOSDevUK: Hacking Health

[Image: most common chronic conditions. Credit: Centers for Disease Control and Prevention]

My notes from the talk Emily gave at iOSDevUK.

What are health apps?

  • Step counters
  • Fitness trackers
  • Diabetes apps
  • Heart rate monitors
  • Bluetooth enabled medical devices

Apple and Google have decided this is where the future is, and have gone all in: Apple with HealthKit, Google with Google Fit. Both provide standardised APIs for reading and writing health information with defined types, plus centralised storage. This enables gathering data from a range of different sources without needing to connect to a billion different APIs. Fine-grained permissioning means the user stays in control.

If the user says no, the app can no longer see that the information is even there. E.g. if you could see that blood sugar data was being stored, you could infer things about the user even without reading the values.
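HealthKit's permission model can be sketched like this; the specific types requested here are just an example:

```swift
import HealthKit

// Sketch: request read access to a couple of HealthKit types. The user can
// deny any subset, and — by design — the app cannot tell "denied" apart
// from "no data", so it cannot infer a condition from a refusal.
func requestHealthAccess(completion: @escaping (Bool) -> Void) {
    guard HKHealthStore.isHealthDataAvailable() else {
        completion(false)
        return
    }
    let store = HKHealthStore()
    let readTypes: Set<HKObjectType> = [
        HKObjectType.quantityType(forIdentifier: .bloodGlucose)!,
        HKObjectType.quantityType(forIdentifier: .stepCount)!,
    ]
    store.requestAuthorization(toShare: nil, read: readTypes) { granted, _ in
        completion(granted)
    }
}
```

This only runs on iOS devices with HealthKit available; the `granted` flag tells you the request was processed, not which individual types the user allowed.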

Parkinson's App:

  • Insight into effects of personal choices.
  • Better understanding of reactions to medications.
  • Ease of sharing information with care circle.
  • More accurate information provided to medical practitioners.

Parkinson's – sleep and diet have a tremendous effect. Give people information on this, and give them control over their life. Insight into reactions to medications.

Parkinson's patients see consultants for 10 minutes every 6–8 months. The patient has to provide ALL their information, and the practitioner has to provide information back, typically via a questionnaire. That is an incredibly difficult thing for patients to remember accurately, and it is influenced by their mood when they fill it in. Helping people record how they are doing every day means that 10-minute slot can be used far better.

App:

  • Enter and alert on medication schedule.
  • Track adherence.
  • Track lifestyle factors, mood, diet, fitness etc.
  • Measuring severity of symptoms (e.g. use gyroscope to measure tremor, compare before and after).
  • Track side effects.
  • Allow correlation between lifestyle choices and presentation of condition.
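The gyroscope idea above can be sketched as follows. On-device the samples would come from Core Motion's rotation-rate readings; the choice of RMS magnitude as the severity metric is an illustrative assumption, not the talk's method:

```swift
import Foundation

// Sketch of a tremor-severity metric: root-mean-square of gyroscope
// rotation-rate magnitudes (rad/s). On iOS the samples would come from
// CMMotionManager; RMS as the metric is an illustrative choice.
func tremorRMS(of rotationRates: [Double]) -> Double {
    guard !rotationRates.isEmpty else { return 0 }
    let meanSquare = rotationRates.map { $0 * $0 }
        .reduce(0, +) / Double(rotationRates.count)
    return meanSquare.squareRoot()
}
```

Comparing this value before and after a medication dose would give the "before and after" comparison mentioned above.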

Issues:

  • Ethical
  • Legal
  • Technical

Do no harm – the Hippocratic oath. We are devs, not doctors, so we are probably not going to do harm directly, but we have a duty to our users: our apps must not cause them to do something that will harm them.

Patients are vulnerable. They can make decisions based on what you show them – e.g. diabetics acting on displayed blood sugar readings.

E.g.:

  • Self diagnosis app:
    • Misdiagnosis.
    • Delay in seeking proper medical advice.
    • Self medication problems:
      • Unknown interactions.
      • Unwanted side effects.

Pay attention to potential harm. Think very carefully about design.

Respect:

  • Your users are more than their condition.
  • Think about people rather than patients.
  • Use language carefully.
  • Think about how you word and time notifications (e.g. what if a user is giving a presentation from their phone when a medication reminder interrupts? Allow notifications to be turned off).

Consent:

  • People want to protect their personal medical information.
  • Informed consent around data sharing and collection.
  • Opt in, not opt out (granular control).
  • HealthKit and GFit permissioning.
  • If you're not prepared to tell people exactly what you're doing with their data, think hard about what you are doing.

Stats are hard:

  • Be careful if you use statistics to tell people how safe their data is.
  • People are often scared by statistics.

Transparency and Honesty:

  • Users will not share data with you unless they trust you with it.
  • Expose your ethics, standards and decision making process.
  • Warrant Canary – libraries in the US used to put a sign in the window saying "The FBI has not been here to seize records". If the sign was removed, it signalled that they had been served, even though they were legally barred from telling people directly.
    • rsync.net – the first company to publish a warrant canary.

“When you start to gather and store information about a person that they would normally only share with their closest family and medical carers, you have a responsibility to that person to care about what happens to that data. If you do not care, in my opinion, you have no business working with private, personal medical information.” ~Emily

Legal Stuff

The Diagnosis Line (what is and isn't a diagnosis).

  • Example: 23andMe
    • Sent back the statistical likelihood of the genes you are carrying.
    • People don't understand statistics and were interpreting the results as a diagnosis.
    • Rebranded as a genetic detection service (gave people the analysis, no conclusions).
  • If you take data, analyse it, and present conclusions, that can be interpreted as diagnosis, which may need to be regulated.
  • US and Europe have different rules.
  • Best to present information, allow users to draw conclusions themselves.

Data Protection Act:

  • Only collect what you need.
  • Keep it secure.
  • Ensure relevant and up to date.
  • Only hold as much as you need for as long as you need.
  • Allow the subject of the information to see it on request.
  • Fair processing: ensure it is handled in ways that are transparent and that they would reasonably expect.
  • Do not transfer outside of the EEA unless compliance is ensured.
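The "only hold as much as you need for as long as you need" principle can be sketched as a retention purge; the 90-day window here is an illustrative assumption, not a legal recommendation:

```swift
import Foundation

// Sketch of a retention policy: drop records older than the retention
// window. The 90-day default is illustrative only.
struct Reading {
    let value: Double
    let recordedAt: Date
}

func purgeExpired(_ readings: [Reading], retentionDays: Int = 90,
                  now: Date = Date()) -> [Reading] {
    let cutoff = Calendar.current.date(byAdding: .day,
                                       value: -retentionDays, to: now)!
    return readings.filter { $0.recordedAt >= cutoff }
}
```

Running such a purge on a schedule (rather than relying on manual cleanup) makes the retention promise something you can actually demonstrate.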

HL7 and HIPAA

  • HL7: international standards for the interoperability of health information technology.
  • HIPAA: the US regulation covering health data privacy and security.
  • HealthKit does not conform to HL7 but does to HIPAA.

Don't overlook data protection. Don't lose anything.

Technological

Secure storage:

  • Disk encryption.
  • Public key infrastructure.
  • IP security.
  • Data masking.
  • Data erasure.
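On iOS, the disk-encryption bullet maps onto the Data Protection API; a minimal sketch (the function and file are illustrative):

```swift
import Foundation

// Sketch of at-rest encryption on iOS via Data Protection.
// .completeFileProtection keeps the file encrypted whenever the
// device is locked; .atomic avoids partial writes.
func saveReadings(_ readings: [Double], to url: URL) throws {
    let data = try JSONEncoder().encode(readings)
    try data.write(to: url, options: [.completeFileProtection, .atomic])
}
```

This only covers data at rest on the device; transport security and server-side storage need their own measures.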

Apple doesn't seem to have published how HealthKit data is stored.

It's not just about how you're storing data but also about your process: if an attacker only needs to bribe one person, your data is not secure.

Pseudonymisation:

  • Huge topic.
  • Ensuring individuals are statistically hard to identify from data.
  • Separating out PII from other information:
    • Different servers, databases.
  • Why should they not be identifiable?
    • E.g. a cancer patients' data leak: the data was sold on to a research company and contained contact details and occupations. Patients were contacted directly and asked intrusive questions.
  • Who is accessing your data and what do they need?
    • E.g. an insurance company: if it could recognise people, it might give them higher premiums because of things like not taking medication on time.
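The separation of PII from other information described above can be sketched like this; the store names are hypothetical, and in production the two stores would live in separate databases or servers:

```swift
import Foundation

// Sketch: PII lives only in one store; clinical records carry an opaque
// pseudonym, never the identity. Keeping the two stores on separate
// servers limits what any single leak exposes.
struct PIIStore {
    private var names: [UUID: String] = [:]   // pseudonym -> identity

    mutating func pseudonym(for name: String) -> UUID {
        if let existing = names.first(where: { $0.value == name })?.key {
            return existing   // stable mapping for a known person
        }
        let id = UUID()
        names[id] = name
        return id
    }

    func identity(for pseudonym: UUID) -> String? { names[pseudonym] }
}

struct ClinicalRecord {
    let patient: UUID          // pseudonym only, never the name
    let symptomSeverity: Double
}
```

A researcher or insurer given only `ClinicalRecord` data cannot resolve a pseudonym without access to the separate PII store.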

A11y:

  • Good practice.
  • Think about who your audience is:
    • e.g. Parkinson's tremors.
  • Coordination symptoms.
  • Medication side effects.

Miscalibration:

  • E.g. Therac-25
    • Radiation therapy machine with two beam modes: a high-powered X-ray mode (used with a beam spreader) and a low-powered electron mode.
    • Six accidents resulted in patients being given roughly 100x the intended dose.
    • Caused by race conditions in the control software, including a one-byte counter overflow.
    • Poor calibration could cause a lot of harm – giving people bad information about their medical state.
  • Check and double check calibration.
  • Publish your algorithms.

Localisation – conversions:

  • HealthKit and GFit provide APIs for this.
  • Even NASA gets this wrong:
    • E.g. the Mars Climate Orbiter, lost because of a metric/imperial unit-conversion error.
  • Language.
    •  American Airlines. “Fly in leather” campaign, became “Fly Naked”
    • Dairy association. “Got milk?” became “Are you lactating?”
    • Pepsi. “Pepsi will bring your ancestors back from the dead”
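HealthKit's `HKUnit` handles conversions for you on-device; as a tiny standalone illustration of why they matter, here is blood glucose, which the US reports in mg/dL and much of Europe in mmol/L (the 18.0182 factor comes from glucose's molar mass of roughly 180.16 g/mol):

```swift
// Blood glucose: the US uses mg/dL, much of the world uses mmol/L.
// 1 mmol/L = 18.0182 mg/dL (derived from glucose's molar mass).
let mgPerDLPerMmolPerL = 18.0182

func mmolPerL(fromMgPerDL value: Double) -> Double {
    value / mgPerDLPerMmolPerL
}

func mgPerDL(fromMmolPerL value: Double) -> Double {
    value * mgPerDLPerMmolPerL
}
```

Showing a UK user "100" when they expect "5.5" is exactly the kind of bad information about their medical state the calibration section warns about.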

Data provenance:

  • Where does data come from, and can it be trusted?
    • Important both for data you use and data you provide.
    • Especially if selling on to research organisations.
  • How accurate is it?
  • How could inaccuracy hurt my users?
  • Impact of HealthKit and Google Fit: you do not know where that data is coming from.

Why Bother?

Common causes of death. If we could make apps that help these people live more fulfilling lives, or prevent them from developing the condition in the first place…

Most common chronic conditions: high blood pressure, Alzheimer's. We could improve lives:

  • Improve lives, maybe even save a few.
  • Empower people.
  • Improve quality of care.
  • Provide data to help research these conditions.
