Andy Coravos: Safe and sane at the intersection of human health and tech

Greetings much later on this Thursday than I had planned, but as they say, man plans, the universe laughs.

Yes, here we are, and without too much lip flap from me, you'll hear my conversation with Andy Coravos of Elektra Labs, who's also deeply involved with one of my favorite things, the Defcon Biohacking Village and their #wehearthackers stuff. Oh, and she's also an advisor to the Harvard Program in Therapeutic Science. And you can find her on Twitter when she's not doing one of the eleventy-million and five amazing things she's up to on an average day.

Short snort on Elektra Labs, direct from their website: “Elektra Labs offers solutions for any stage of the biometric monitoring implementation process, whether organizations need assistance learning best practices, choosing or deploying connected products for remote health monitoring.”

In “explain like I’m five” terms, Elektra Labs helps turn stuff you can put on your body to see how your body’s working into stuff that medical research folks can turn into cool new treatments for diseases, or cool new ways to manage an illness or condition. And they make sure that stuff is safe to use.

Andy’s work revolves around the intersection of health data, biosensors, biometrics, drug development, and cybersecurity. That’s a biiiiiig intersection, with a metric shit ton of traffic in Normal (Is A Dryer Setting) times. During a pandemic, it’s like finding out you’re at the helm of the Millennium Falcon with nothing but the user manual for a Ford Pinto in your hands, and the whole Empire on your six.

She’s so calm, too, which is, I’m sure, why she’s so very, very good at what she does.

No more lip flap! Here we go.


Casey Q: I was fascinated when I looked at your LinkedIn profile and saw that early on in your career, you'd been working with the Washington Capitals. Now, being a Ranger fan, that did bring me up short, but that's fine. So, what did you do for a hockey team? How did that intersect?

Andy Coravos: Yeah. So, I was a freshman in college and it was my first paid summer college gig. I was very excited, 'cause a lot of my freshman friends couldn't find paid jobs. My dad's a dentist, and when I was growing up, a lot of ice hockey players didn't have to wear mouth guards, which is kind of amazing to think about, so teams needed a dentist on staff. And so, as a kid, I would go to tons of hockey games with all these other young kids who all had their teeth knocked out. I've loved hockey for a while, and I'm very glad that everybody gets to wear mouth guards now; it's much safer. My dad doesn't really have a job anymore in that type of field.

I did everything that happened between the games. So, I was an intern that worked with the t-shirt launchers, the ice girls that would go on the stage. I made the trivia that went on the jumbotron.

Casey Q: I was, back in the day, a part of the audio crew that would be mixing the game or hanging off the plexiglass, hanging pressure microphones. It was fun. And I didn't really like hockey until I started doing games in the arena. And then it was like, Oh, I get it now. I get it now.

Andy Coravos: The craziest thing that I did that summer was their mascot is a bald eagle. And they asked me to get a bald eagle. And I thought that was a joke because I was like, well, they're endangered. So, is this a joke you play on the interns?

And then one of my teammates said, actually, no, they're not endangered anymore. That was the year they came off the endangered list. So, I called all these exotic pet stores asking for a bald eagle and they all laughed at me, until finally I called the Department of the Interior, which had handled the celebration. There's one bald eagle that will do shows: his name is Challenger, his handler's ringback tone is "Fly Like an Eagle," and he only flies Delta. And I found a bald eagle for the Washington Capitals.

Casey Q: I guess it's like, "what would I have done?" I probably would have called the zoo, but they probably would have laughed at me too, but good for you. The Department of the Interior, you can always count on them for American animals. Although if they'd asked you to get a buffalo, the answer to that would be no, just ... no, you don't want one of those anywhere near you. They don't smell good. Eagles are much nicer, I mean, as animals go. So how did you end up in the healthcare and biotech space?

Andy Coravos: I always thought I'd end up in healthcare. My dad's a dentist, my mom's a nurse. And I thought that was the only job. My parents had never switched their jobs. So, I didn't even know people ever change their jobs, so I just figured I'd end up in healthcare. It's all we talked about growing up. I am abysmal with blood, horrible with it. I really don't do well. And so, my family joke was always, how did you end up in healthcare if you were so bad with blood?

When I was in high school, I interned at Boston Children's Hospital, and my mom, the nurse, worked in the operating room. So, I got the patients after she had worked with them, and still, it was just not my thing. And so finally we figured out how I would be working in healthcare, and it was through software.

Casey Q: Back when I was still working in the news business, one of the things that I ended up carving out was I was the only one who knew how to run the Canon camera that had the adapters that went on the microscopes. You know, the electron microscopes of the time, this was back in the eighties and nineties. I would get dispatched with this Canon camera and all these adapters to fly hither and thither on the East coast to labs, to go and shoot the thing where they were doing whatever. And yeah, usually it was dismantling mice.

Sometimes it was dismantling larger mammals, but I was always glad that the viewfinder was black and white. I don't have a huge problem with blood, but it would not have been my first choice, so the black and white viewfinders are really helpful, but sadly, all viewfinders are now ... they're your phone and they're color.

Tell me about Elektra Labs' COVID-19 remote monitoring tools project. Tell me how that's going and what you hope will come out of it.

Andy Coravos: Sure. So, I'll tell you what we normally do at Elektra, and then how the COVID project grows out of that. The work that we're doing is supporting pharma companies that are collecting biometric signals at home.

So, imagine that you're in a clinical trial, and you want to collect somebody's heart rate or blood pressure or tremor. Most people who are in clinical trials don't actually want to go to the clinic, that takes more time. You want to have your normal life at home. And so now there's all these sensors and wearables, a lot of pharma companies are wondering, can I do decentralized clinical trials, can I collect more data at home? So, people don't have to come in, and there's a ton of different sensors that are good and could be used in a clinical study. More and more pharma companies are using digital tools for primary and secondary end points, which is a huge deal in submitting to the FDA. So, you'll notice I don't use the term device because sometimes these are FDA devices, and sometimes they aren't, and they don't have to be, necessarily, in a clinical trial.

So, through that work, our team has tracked over a thousand different sensors across 300 different medical conditions, and we're tracking about 3,000 different types of measures. So even for something like sleep, there's sleep efficiency, sleep latency, the number of times you woke up in the middle of the night, the number of bouts of deep sleep and light sleep.

And so, we're sitting on this treasure trove that was normally just used for pharma. And we realized a lot of our pharma customers were looking at vital signs. And if we took a subset of our atlas during COVID, now that everybody has to stay at home, could we release the vital sign data for more people to use it?
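(A quick aside for the code-curious: measures like the sleep numbers Andy just rattled off are straightforward to derive once you have a night of staged sleep intervals. Here's a minimal, purely illustrative sketch in Python; the stage names, the interval format, and the night's numbers are my own assumptions for the example, not Elektra's atlas or any vendor's actual algorithm.)

```python
# Illustrative only: deriving a few common sleep measures from one night
# of staged sleep intervals. Stage names, interval format, and the data
# itself are assumptions made up for this sketch.

# Each tuple: (start_minute, end_minute, stage), measured from getting into bed.
night = [
    (0, 20, "awake"),     # took 20 minutes to fall asleep
    (20, 110, "light"),
    (110, 170, "deep"),
    (170, 180, "awake"),  # one mid-night awakening
    (180, 300, "light"),
    (300, 360, "deep"),
    (360, 480, "light"),
]

time_in_bed = night[-1][1] - night[0][0]
time_asleep = sum(end - start for start, end, stage in night if stage != "awake")

# Sleep efficiency: fraction of time in bed actually spent asleep.
sleep_efficiency = time_asleep / time_in_bed

# Sleep latency: length of the initial awake stretch before first sleep.
sleep_latency = night[0][1] - night[0][0] if night[0][2] == "awake" else 0

# Awakenings after sleep onset, and bouts of deep sleep.
awakenings = sum(1 for _, _, stage in night[1:] if stage == "awake")
deep_bouts = sum(1 for _, _, stage in night if stage == "deep")

print(sleep_efficiency, sleep_latency, awakenings, deep_bouts)
```

One raw signal, several distinct measures, which is roughly how a thousand sensors fan out into some 3,000 measure types.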

Casey Q: So, are there any consumer-facing devices - I'm not asking for brand names, just in general - any consumer-facing devices already on the market that have proved useful in this work? I'm just curious.

Andy Coravos: Yeah. One of the biggest misconceptions that people have is that there's a difference between medical grade and consumer grade. So, one of the questions I'll just ask you is what would you consider to be consumer? 

Casey Q: Well, there's the thing I have on my wrist right now, which talks to my phone because it's the same brand. This is the second iteration, an upgrade from the one I had previously. What I like about it is that it reminds me to get my fat ass out of the chair every once in a while. And it also tracks my heart rate. And I do wear it to track my sleep occasionally, but it's also good to just have sort of a running record of things like heart rate and activity levels for my own reference, just because I'm that kind of geek.

But I have not yet found any clinical care team that is interested in taking in this data. Not as though I've offered it to them. They see I have the watch; I bring up the thing, you care about any watch stuff? No. Okay, great.

Andy Coravos: So, the consumer-versus-clinical-grade question is a super interesting one, because what you said is that consumer means you can buy it. And then what people often define as medical grade is whether or not it's FDA cleared. And it turns out that these things are increasingly merging. So, two apps on the Apple Watch got cleared as medical devices. Interestingly, not the watch itself, but two apps on it.

So, the OTC, over-the-counter, ECG app, and then the algorithm they're using for the PPG, which is the green light, both got cleared as software as a medical device. So these are consumer products. AliveCor has created an AFib-detecting tool that is also a consumer device, one that consumers can buy directly.

And then there are a number of other products that are similar in nature, where it's pretty easy to get a, quote, prescription for them. So, Abbott has the FreeStyle Libre, which captures continuous glucose readings. And so, the really tricky thing is that just 'cause something's consumer doesn't mean that it's worse than something that's medical grade.

So, there's a ton of FDA-cleared products that are actually quite bad. Just because something got FDA cleared doesn't mean that it works. And one of the tricky things about the FDA is that you need a predicate to go through the devices group. You just show that something looks like something else; you don't often have to share a whole lot of data that it actually works.

Casey Q: The 510(k) rule.

Andy Coravos: The 510(k) process. So, if you look up John Oliver, there is an amazing segment that will make you laugh and cringe at the same time, about how the predicate system really blurs this line. And so, whether or not something's FDA cleared isn't always that useful for knowing whether or not it's quality data.

Casey Q: Yes. And then there's the film The Bleeding Edge; I know some of the people who were in that. I mean, no process is perfect, and I suppose that's another process we can point to and say, well, not perfect.

Andy Coravos: Yeah. And in the age of COVID, there've been tons of doctors who know that it's more important for people to stay at home as much as they can. And so, people have been using a lot more remote monitoring products and connected products.

And so, it's really just figuring out is it accurate in this population? And I think one thing to think about is, no one would ever say, is this a good drug or is this a good food? But we say, is this a good heart rate monitor? And you say, well, what's the drug for, what's the food for, what do you need for this heart rate monitor? And with drugs and food, what we've created are things like labels. So, drugs say, what population was it tested on? Same with any of these smartwatches, we should say, well, what population was it tested on and how are the algorithms developed? And so sometimes with food, you might need more sugar or you might need more protein. And so, same with these wearables and sensors.

Maybe you need more security. Maybe you need more data rights. Maybe you need more usability. And so, I think something just for people to think about is all of these tools are connected to the internet. And I know that you and I went to Defcon together. And so, a lot of people aren't thinking about the security exposure that they have when they're connected to a medical product or a consumer product, that's collecting quote, unquote, health data.

Casey Q: Particularly when it's attached to your phone. I mean, I do think about that a lot. But again, it gives me information that I find useful, and I'm also sophisticated enough to know how to go in and tune some of the settings in some of these things. But I do worry about the average human, whether or not they're going to take the same steps, or just do the usual "oh yeah, I accept," because I want this app, this thing, this tech tool to work, so I have to accept the terms, so I do and I move on. It's like, oh, wait a minute, wait a minute. My part of the party is to try to boil those terms of service down to something that an average human being could actually read, and not be 40 pages long in six-point type. But that's a story for another day.

Andy Coravos: That's a super interesting thing; let's take a moment on that, because I think most people don't realize that there's a difference between security and data rights. Security, the way I think about it, is an attack on a system. When you have bad data rights, that may not be an actual attack on the system.

So, things like Cambridge Analytica, that was not a security incident at Facebook; that was misinformed data rights. And so, one of the scariest things is, one, do people read the terms of service? And the second thing is, does it matter? The craziest thing about that is there's a team at Harvard who published a big paper in JAMA, which we can link to at the end of the podcast. What they did was they looked at smoking cessation apps and asked, does this app share data with Google or Facebook? And a certain percentage of them said, no, we don't share. Then security researchers looked at the apps and asked, do they send packets to IP addresses associated with Google and Facebook? And it turned out that a majority of the companies whose terms of service said they did not share, actually did. Do you want to guess why that might happen?

Casey Q: I would say that, rather than it being active malfeasance on the part of developers, I think that developers don't know the back end of this well enough to be able to say anything clearly about their terms of service or even where their data is going.

Andy Coravos: Yeah. A very common thing is that the terms of service are often written by the lawyers in a group, and the engineers in the group often aren't reading the terms of service, or doing regular check-ins with the legal team to ask, what are we promising our customers on the other side? And they're often not reading the legal agreements of all the software packages they're pulling in, either. I think we'll probably talk about this around Defcon, but it comes down to having a software bill of materials, and really understanding how each of the different components of the software works. And so, there's just this really big gap between whether the product is actually sharing data, and whether the legal team writing the terms of service knows that it is.
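(Another aside for the code-curious: the ToS-versus-traffic gap Andy describes boils down to a comparison between what a policy declares and which hosts the app is actually observed talking to. Here's a hedged Python sketch of that check; the tracker domains and data shapes here are illustrative assumptions of mine, and real studies like the JAMA paper work from actual captured network traffic, not a hand-written host list.)

```python
# Illustrative sketch: flag tracker domains an app contacted even though
# its terms of service claim no third-party sharing. The domain list and
# observed hosts are made-up examples, not data from any real study.

TRACKER_DOMAINS = {"google-analytics.com", "graph.facebook.com", "doubleclick.net"}

def undisclosed_trackers(declared_sharing, observed_hosts):
    """Return tracker hosts contacted despite a 'we do not share' claim."""
    contacted = {
        host for host in observed_hosts
        if any(host == d or host.endswith("." + d) for d in TRACKER_DOMAINS)
    }
    # If the ToS admits to sharing, there's nothing undisclosed to flag.
    return set() if declared_sharing else contacted

# One hypothetical app whose ToS says "we do not share data with third parties":
hosts_seen = {"api.quitsmoking.example", "ssl.google-analytics.com"}
print(undisclosed_trackers(declared_sharing=False, observed_hosts=hosts_seen))
```

A non-empty result is exactly the kind of ToS-versus-behavior mismatch the researchers found, and a software bill of materials is one way engineering and legal can close that gap before shipping.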

Casey Q: I did hear from our mutual friend, Andrea Downing, that she has been tagged as a speaker for the virtual Defcon this year in the Biohacking Village. What's on tap this year, if you know? Are you part of the team planning and running the Biohacking Village again this year?

Andy Coravos: So, this year, I'm not part of the team that's picking the speakers. So, I don't have any tips beforehand, but I think it might be interesting to share a little bit with people about what is Defcon and should they go, and was it canceled.

Casey Q: Well, absolutely. I did a couple of report-ins from Defcon last year, my first after wanting to go for a long time. I felt very much at home at Defcon, and the Biohacking Village was where I ended up – there were other places I wanted to go, but there's only so many hours in the day, and how much running up and down Las Vegas Boulevard, in August, would you really want to do? So, I ended up just sticking with the Biohacking Village. My impression of Defcon was that it was the most organic, nice chaos I had ever witnessed, and it was chaos for a purpose. There's a lot going on that I didn't really even get a chance to figure out, just because there was so much happening, and because it's divided up into these villages. People out there, if you're in the academic world, think interest group; they call it a village at Defcon.

The organized chaos thing is actually, to me, part of the appeal, because if everything's super programmed down to the nth degree, it's like, okay, so what are we really learning here? Defcon is the world's largest hacker conference; somewhere between 30,000 and 40,000 people show up. This is the first year it hasn't happened live, but they are doing it virtually. It's sort of the ethos of hacking. I think people who haven't should dive in a little bit, even just a shallow dive into the principles of hacking, because everyone's a hacker, to some degree or another. This is Andrea's big message, or one of them: e-patients are hackers, they just don't know it yet.

Anyone who's been curious about how their own body works, and figuring out how to tune it, they're hacking. Anybody who has a device or a thing or whatever, and they wanted to change it a little bit, so it will work better for them, whatever it is? You're hacking. That's what hacking is.

Andy Coravos: Totally. And I think most people don't realize that there are largely three types of hackers. There's the traditional black hat hacker, somebody who exploits a system for personal or political gain, or just for fun. This is definitely the cyber-criminal part of the world, where they don't care about the ramifications for whatever they're working on. They're in it for the lulz, with a Z. Then there are white hat hackers, ethical hackers who know the difference between finding a vulnerability and exploiting it. You don't exploit the vulnerability when you find it; you participate in coordinated disclosure and let the company know that you've hacked something.

And then there are gray hat hackers, which is where it gets a little confusing, because maybe one group thinks it's white and one thinks it's gray, and you're working through whether they're participating in illegal activity or not. An example of this: a number of people at the Biohacking Village were doing security research on pacemakers, and they found a pretty significant vulnerability. Pacemakers try to maximize their battery life, which means they don't do anything computationally expensive.

Turns out encryption is computationally expensive, so they just weren't encrypting the pacemaker's protocol. Pacemakers stay in a low-power mode, and what the researchers could do was ping the pacemaker to wake it up into a high-power mode. That would drain the pacemaker battery from years of life down to a much shorter time period.

The second thing they demonstrated: pacemakers can deliver a shock upon a cardiac event to re-stimulate the heart, and they could make one deliver that shock upon a non-cardiac event, which could potentially kill somebody. So, when they found this, they went to the device companies and said, "Hey, we found this. We'd like to disclose it to you in a coordinated, ethical fashion." And the device companies effectively said, thank you, and now we're going to sue you for tampering with our product. So thankfully the leaders in the Biohacking Village said, "Hmm, this seems weird. If I find a vulnerability in Snapchat or Facebook or Google, there's a coordinated disclosure program, I tell them about it, they ship an update."

And the device companies said, "Hey, we can't just ship updates. We're regulated by the FDA. We can't ship whenever we want." A little warning bell went off for some of the security researchers, the white hat hackers, who said, I don't think this is right. So, they went to the FDA directly, which is very rare. And I think now more and more of these hackers are starting to work with regulatory agencies and governments, and those governments have started to come to Defcon, which is very neat, 'cause Defcon's growing up.

Casey Q: Yeah, there was an FDA Commissioner who was there last year. It was great.

Andy Coravos: Yeah. And they started to attend. Three years ago was the first time that the FDA attended Defcon, and they were like, "Well, actually, the device companies are wrong. They have to ship an update, they're mandated to ship an update. And what's Defcon, and can we go?" So, the Biohacking Village started bringing the FDA. They protected them, they gave them speaking slots, and it's really neat to watch the agencies start to work with the researchers. They ended up hiring a number of researchers out of Defcon into the agency, and I think this is a really powerful collaboration. That's actually how I started to get involved and served at the FDA in the digital health work unit, working on software as a medical device and some of the security work that they were doing.

Casey Q: This ties back to my question earlier about the COVID-19 remote monitoring tools project that your company is running, 'cause I think there's sort of a virtuous cycle that gets set up if device manufacturers, be they makers of FDA-approved devices or consumer devices, can all start working together on eyeballing things, finding things that work, and adding them to the arsenal of stuff they build, whether it's a pacemaker or a smartwatch, or a smartwatch that's a pacemaker. That's probably coming; it's just not here yet. Where do you see us on the trajectory toward the line blurring between what is considered consumer and what is considered a regulated device, and whether we can get it all in one bucket?

Andy Coravos: Yeah. I have a lot of complex thoughts, and I think this is something we as a society will have to work out. Part of me wants to put the patient in the driver's seat. I want patients to have as much data as they possibly can to make their own decisions. The other part of me says, there are a number of studies showing that people who do quantified-self projects have increased anxiety and decreased wellbeing, and they're really stressed by tracking all this data, especially if they don't have a clinical reason to do so. For example, during COVID, I had a friend in New York City who called me extremely upset. Her temperature had spiked, she was feeling terrible, she was in New York where there was a huge outbreak, and she was sure that she had COVID.

It turns out, and there's a big JAMA study on this, that only one third of people who tested positive for COVID and were hospitalized had an elevated temperature. So, two thirds don't, which means temperature is not actually a great reading. There are tons of reasons for that; maybe they took Tylenol to bring the temperature down. But it turned out my friend was ovulating, and she had never known her baseline. So, I think there is a big challenge in making sure that data is democratized, but also that people have clinical oversight, so that we don't cause even more of a mental health burden. I don't really know the answer to this. I don't really want to withhold things from patients.

There have been a lot of times when things have been withheld; think about the pregnancy test and how long that was withheld from people being able to use it. But I also think it's kind of irresponsible to just share data if we don't know that there's a clinical reason to do so, especially for less knowledgeable people who might get confused, think they have COVID, and get anxious about that.

I've always been kind of worried about how to message that, because sometimes it seems like it might be paternalistic. So how do you decide, as a company, when you share and when you don't share with people? Or how do you provide enough information that somebody can think it through, while you're being responsible in your sharing? It's not easy.

Casey Q: No, it's not easy. And I think that's where things like patient advisory boards, patient focus groups, and just sort of continual outreach to the patient community come in, whichever patient community it is whose condition is being assessed, addressed, tested, or tracked in this.

And if it's just a public health issue, then you can invite anybody you want. But depending on what the situation is, have the end user in the room when you're making decisions about, okay, well, how much do we say, and how much do we share? Everything down to the terms of service.

And, at that point, once you've built it, the question is whether the people who might find it useful will actually, A, find it useful, and B, get something out of it. Because there's the "if you build it, they will come" thing that I've seen prevalent in way too much consumer-facing tech, whether it's healthcare or whatever, but particularly healthcare, where people build stuff and it's "Oh yeah, this is going to be awesome. People are going to love it." And it's like, "Wut?" You either didn't explain it, or the way you're making money off this kind of feels weird to me, because I figured it out.


Andy’s work is definitely not in the “if you build it, they will come” zone. Everything I’ve come to see, and know, about what she and Elektra Labs are up to tells me that I can trust her word on emerging ideas, and follow her guidance on stuff, either in the pipeline or out in the wild, that intersects with human health and the technology that’s woven around it at the point of use - the human themself.

Oh, and speaking of Defcon, ‘cause I kinda never shut up about it, there were many superb sessions in the Defcon Safe Mode Biohacking Village. 

Two I particularly enjoyed were Lucia Savage talking about emerging privacy legislation, and how to influence the discussion and debate about it; and Andrea Downing talking about how patient communities can use threat modeling to prevent the spread of medical misinformation online. 

Next week my buddy John Wilbanks will be back to talk about what's going on in his world of making data sharing easier and safer, from his POV at Sage Bionetworks.



Danny van Leeuwen, also known as Health Hats - with his diverse and prolific health experience, Danny uses his multiple hats to empower people as they travel toward their best health. To join Danny on that best-health journey, follow his Health Hats podcast on iTunes - it’s terrific!

Doco.la is the care communication platform that keeps the trusted relationships and meaningful conversations between healthcare providers, patients, and families going, even during a pandemic. Clinical teams bridge the care-gap between visits by e-prescribing high quality patient information and education materials from the Doco.la marketplace, and contribute their own materials, too. Let’s call it “parrot-proofing for medical professionals” since that’s what Doco.la is! 


Patrons include:

Eran Kabakov of Doco.la

e-Patient Dave

Janae Sharp

John Ruane

Alicia Staley

Bill Adams

Sharon Terry

Lyubov Lytvyn

Don Goyette

Mary Gurney

Bonnie Cranmer

Cindy Geoghegen

Marilyn Mann

John Wilbanks

Janice Lynch Schuster

David C. Norris

Ileana Balcu 

Jack Barrette

Jim Garrison

Diane L. Harris

Gigi Peterkin


Hilary Piland

Nick Dawson

Cece Casey

Suzanne Tormollen

Helen Bartley

Wanna be on this list? Become a Patron!



Open + close:

Movin’ On Up by Podington Bear


Music from https://filmmusic.io

"The Show Must Be Go" by Kevin MacLeod (https://incompetech.com)

License: CC BY (http://creativecommons.org/licenses/by/4.0/)

Podcast distribution rights:

Creative Commons 3.0 license
