In this episode, Ethan shares his insights on questions like:

How can you ensure success in an industry that is at once old-school and strict, yet hugely innovative and ever-changing?

And how do you make sure your clients get their documentation in time for the EU CE mark application, while keeping it high quality and fully compliant?

Citemed's operating partner, Ethan Drower, talks to Subhi Saadeh on the Combinate Podcast about everything from the major challenges of developing regulatory software to his smartest collaboration and the success it has led to.

Listen to Ethan tell the story himself in this podcast!

Listen now

https://youtu.be/zSzihYk1ago

Like this episode? Subscribe to our channel today on YouTube.

Links from this episode

Citemed

Success Made to Last Podcast – Spreaker

Some of the highlights of this episode include:

  • A problem-driven career
  • Challenges with EU MDR Clinical Evaluation Reports (CERs)
  • Literature Search and Literature Review

Episode Transcript

This transcript was generated using an automated transcription service and is minimally edited. Please forgive the mistakes contained within it.

Subhi Saadeh: [00:00:00] You’re listening to another impactful episode of the Combinate Podcast, the show where we strive for quality in everything because quality is everything. I’m your host, Subhi Saadeh. I’ve been working on medical devices, pharmaceuticals, and combination products for the last ten years, and my goal is to understand. Each week I sit down with leaders to understand and bring together medtech and biotech, to examine the roadblocks in development and access we face, and to bring to light concepts and tools from our industry and others to help address them. Thank you for joining me, and I hope you enjoy this episode. Hi, everybody, and welcome to this special episode of the Combinate Podcast. I’m your host, Subhi Saadeh, and we are graced and honored today by Ethan Drower, who is joining us from Mexico. Welcome, Ethan.

Ethan Drower: [00:01:13] Hey, so we’re so glad to be on here. Thanks for having me.

Subhi Saadeh: [00:01:16] Yeah. Thanks for coming on. So Ethan’s going to educate us today about literature review, a topic that I don’t know a whole lot about. This area is a little bit interesting for me because, working in combination products, there’s a clinical element on the drug entity side. But then there’s also the literature review piece, which has sort of gained steam, elevated in importance, and become more clearly defined with the update to the MDR. And so you all are focused on the MDR primarily, right?

Ethan Drower: [00:01:54] Yes, correct.

Subhi Saadeh: [00:01:55] Yeah. So I guess before we jump in, you know, Ethan and I were talking about his background, and I always laugh because if somebody is coming from a software-based background, following their timeline is kind of like following a strand of spaghetti. So I’m not even going to try. Ethan, if you could tell folks your story up until now.

Ethan Drower: [00:02:19] Yes, absolutely. And you’re definitely right about software folks taking very, very confusing paths at times. I like to say that my career has been very problem-driven, not directed by my specific goals. When we latch on and come across an industry that has a problem that can be solved with software, with a little bit more organization, with some algorithmic thinking, that’s really when I start to dig in. So, naturally, when I was younger, it was the financial industry, where there are lots of problems that can be solved. I also got into the consumer app space for a bit and did a few passion projects in the speech therapy world. That’s a very interesting set of problems, especially for children that struggle with improving their speech. And then, about four or five years ago, I came across the medical device industry and specifically the regulatory side. And you know, I kind of found a home here and really enjoyed not just the problem but the people and the industry as a whole. I think this is a very special industry with a very unique set of folks trying to solve the problem. And they don’t do it for the glamour, right? They do it because they have a shared mission and a shared appreciation for doing things the right way, and I haven’t really found that in other industries as much as I have in the regulatory world. So, I’m loving it.

Subhi Saadeh: [00:04:02] What about the medical device space particularly? I mean, okay, I hear what you’re saying, but when people go into software engineering, they’re not thinking, “I’m going to help compliance folks with this problem that they have.” Right? So what about it? Because I hear, okay, that’s the medical device industry, but even people who study biomedical engineering sometimes don’t want to get close to that part of the industry with a ten-foot pole as they’re exiting school. They want to work in a lab and things like that. So was there anything there that attracted you?

Ethan Drower: [00:04:42] Um, I think the difficulty and the perception of it being kind of a very boring and stiff world attracted me.

You know, as part of growing as an entrepreneur, you start to gravitate less towards the really popular thing that everybody’s trying to build, such as self-driving cars or blockchain. You start to become more intrigued by an industry that everybody has written off for some reason. We found that there were interesting data problems that could be solved or addressed in this kind of industry. It’s not one that TechCrunch and Silicon Valley are going crazy over; they’re not trying to change the world with it. But at the same time, it has a massive impact on people, on humanity. It’s kind of the perfect intersection of being written off a little bit while also being a hugely important piece of the health space, of the device world. Without people doing the paperwork and getting these things to market, we can’t innovate. We can’t actually help patients, which is the whole point. So that’s what intrigued me. It was a big, juicy market that nobody in the tech world really cares about or sees the potential in, in my opinion.

Subhi Saadeh: [00:06:13] Yeah, you’re probably not going to be giving the keynote at CES, right?

Ethan Drower: [00:06:17] No. No, we’re not. No, we’re not going to be on TV or anything like that because people think it’s boring and it gets written off in budgets. People have been cast into the basements for years to do their paperwork and be quiet. That misconception is a huge opportunity for innovation.

Subhi Saadeh: [00:06:43] Yeah, I found that to be the case with RA. I’ve recently gotten really interested in RA. But more recently, human factors was an area that I found to be kind of boring. I knew it was important, and I had been part of a bunch of human factors studies and had some solid experience in the area. But it wasn’t until I started talking to leaders in the space and reading some of the material they were giving me that I started to have more of an appreciation for it. I think it’s the “show me your friends, I’ll show you your future” thing. Regardless of the subject area, if you surround yourself with people who are into it, then it’s electric. They’ll convince you that it’s not boring before you convince them that it is. Also, I think having a problem-driven career is a really cool way to map out your story, regardless of industry. How did you pick what problem you were going to focus on?

Ethan Drower: [00:08:07] Within medical device?

Subhi Saadeh: [00:08:09] Generally speaking, because you said your career is problem-driven, right?

Ethan Drower: [00:08:12] You mean generally? So, generally, it comes down to the people. You know, when you meet somebody and you have a very inquisitive nature, you like to dig in just out of curiosity. When it came to medical devices, it was easy because it was my father who tipped me off to this problem. He was working for a major device manufacturer, and we went out to dinner one day. He was complaining to me about EU MDR and how he gave a big presentation about how there was no way that he and his staff of two people were going to do 100 literature reviews and maintain them over the long run. He put this whole thing together because it was years out, a year or two ahead of the actual implementation. And he was just whining because nobody listened to it; they just said, “Yeah, we’ll get it done. You guys work quick, so it should be fine.” That’s when it piqued my interest, because I said, “So, every company has to do this?” And he said, “Yeah.” Then he showed me a report that he was working on, and he asked me how much of it we could automate or improve with software. I said quite a bit, and that’s how I got hooked on this problem. I did some more research and networked in the industry. Regulatory people are great; they’re happy to talk about their problems with you. We got some good validation of what we were trying to improve, and we just went for it.

Subhi Saadeh: [00:10:09] Wow. So, your partner is your dad?

Ethan Drower: [00:10:12] Yeah, he’s a co-founder. He’s a major partner in the company.

Subhi Saadeh: [00:10:18] That’s so cool. I heard you talk about your co-founder having 30 or 40 years of experience. I didn’t realize that it was your father. That’s so cool.

Ethan Drower: [00:10:33] It’s one of the few partners you can truly trust in business. So, um, it’s been a dream. Working together and collaborating on this and being able to leverage his insights is just a huge advantage.

Subhi Saadeh: [00:10:49] Yeah, it’s funny, you know, I’ve heard a lot of people say avoid family business, but my father-in-law has a lot of family business. And I was telling him, you know, don’t people say you should avoid that? And he goes, no, don’t avoid it. Family business is the best business, but you have to have clear roles and responsibilities. Most people don’t write things down. Right? So that’s so cool.

Ethan Drower: [00:11:12] Yes, you definitely have to be clear on who’s doing what and what the expectations are. And things change. So if we come across problems and things need to change, both parties need to be open to it. It’s more about the personality than the relationship. The relationship only helps you.

Subhi Saadeh: [00:11:30] Yeah, that’s sick. So now let’s move on to the problem, because literature search is literally what I wanted to talk to you about. As you were going around talking to folks: my experience is that it’s medical safety and clinical folks that are doing the literature searches, not RA. And so maybe you can describe what the problem is.

Ethan Drower: [00:11:56] Well, that is true in companies that have bigger departments, where they can separate it out and they have an actual clinical evaluation team. For a lot of our clients, though, regulatory and clinical is a team of one or two people. So for a lot of the people we work with, this is an extra responsibility that’s been added on with MDR, and their company isn’t going to hire extra people. That’s been the crux of it. Within the bigger companies, it’s a clinical evaluation team, and what they want is to be more organized and quicker. But the smaller manufacturers just want to survive and get it done without blowing up the budget.

Subhi Saadeh: [00:12:42] So what is the problem?

Ethan Drower: [00:12:44] So the problem is that with the transition to EU MDR, the clinical evaluation standards have shifted heavily towards literature, towards scientific literature. And they are now being scrutinized at a much deeper level than they were previously under MEDDEV. Before, in terms of going out there and searching for relevant literature, finding some studies that use your device, and then making some claims based on what you’ve read, you could get away with a pretty unorganized approach. Now the notified bodies in Europe are just hammering people on the conformity of their literature searches and how they integrate that into the clinical evaluation report. So the notified bodies’ focus has shifted thanks to EU MDR, and people are getting non-conformities, they’re getting red flags, because what they did under MEDDEV no longer flies. And that’s the crux of the problem.

Subhi Saadeh: [00:13:54] Is it as simple as formatting, or are there additional requirements they need to adhere to?

Ethan Drower: [00:14:02] The additional requirements stem from two major components. One is the thoroughness of the search, and the second is how easily your search can be reproduced. Under MEDDEV, you could pretty much Google some terms about your device, throw it together in a report, and say, “We did some searches, we looked at some studies, and this is what we found.” But now, you’re going to get a nonconformity from your notified body if they can’t reproduce your exact results with the protocol you’ve provided them. You have to be specific about which terms, which databases, how you’ve evaluated each abstract, when you reviewed the full text, and how you’re going to extract the relevant data that supports your indications, and everything has to be validated. The most common keyword we see in notified body feedback is “validated.” Manufacturers that were just doing some ad hoc searches, threw it together, stuffed it in the CER, and forgot about it are realizing now that it’s a much more thorough process, and it takes a lot of time to organize it in a way that somebody can review and say, “Okay, I understand your process. This all makes sense. I can reproduce it.”

Subhi Saadeh: [00:15:39] So the replication part of the process is super important then?

Ethan Drower: [00:15:43] Yes, it’s incredibly important. If you submit without a protocol that they can follow step by step, it’s going to come back to you. They’re going to send it back and say, “We couldn’t reproduce your results.”

Subhi Saadeh: [00:15:55] Okay, interesting. What does the validation process look like?

Ethan Drower: [00:16:06] This is where it gets very interesting, because you’re dealing with public databases, and public databases are always changing the journals that are available. What matters in terms of validation is the process I show you that says: this is how we searched, here are the results we received on this day, and this is how we processed them. That can mean submitting the actual raw files of the search results with timestamps, and with some databases we even submit screen captures to prove that at this point in time, these were the result counts. If that has changed in six months because the database has changed or data has been corrected, we want to be very clear that we can prove the results we’re showing were what we saw the day we ran those searches. That’s been central.
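
To make the reproducibility idea above concrete, here is a minimal sketch (in Python) of what recording a search snapshot might look like. The function and field names are illustrative assumptions, not Citemed’s actual tooling; the point is simply that each search run is stored with its database, term, timestamp, result count, and raw export, so the numbers reported in the CER can be traced back to what the database returned that day.

```python
# Minimal illustration, not Citemed's implementation: persist one search run
# with a timestamp, the raw records, and a checksum so it can be reproduced.
import csv
import hashlib
from datetime import datetime, timezone

def snapshot_search(database, term, results, out_path):
    """Save raw results of a single search and return its audit metadata."""
    run_at = datetime.now(timezone.utc).isoformat()
    with open(out_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["id", "title", "abstract"])
        writer.writeheader()
        for record in results:
            writer.writerow({k: record.get(k, "") for k in ("id", "title", "abstract")})
    with open(out_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()  # detects later edits to the raw file
    return {
        "database": database,        # e.g. "PubMed" or "Embase"
        "search_term": term,
        "run_at": run_at,
        "result_count": len(results),
        "raw_file": out_path,
        "sha256": digest,
    }
```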

Subhi Saadeh: [00:17:13] So one of the important bits of a literature search is continuously evaluating your risk-benefit profile, right? What about that?

Ethan Drower: [00:17:27] That’s right. That’s where updating comes into play. It’s not just about going out there, collecting the data, and then sayonara. For the literature search itself, yes, you have to collect the data. It needs to be extracted. It needs to be read and understood. Then you have to decide: how does this data affect what I’m trying to say for my safety and performance indications? That needs to be a continuous process. So year after year, or every two years depending on the classification of your device, you need to be performing these searches, understanding them, and making a decision as a manufacturer: has what we found changed what we are claiming? Are there any additional risks that have come up that we need to address? And that process needs to be recurring. You need to continuously be doing it, or at least showing that you are.

Subhi Saadeh: [00:18:27] I guess what I’m trying to understand is that the literature search is a part of the CER, right? Where do things like incident reporting and adverse events feed in?

Ethan Drower: [00:18:42] The adverse events, we roll into our literature searches because we want to package it and do a complete thing, but that’s just a different part of the CER. So you need to be searching not just clinical literature databases like Embase or PubMed, but also specific countries’ public adverse event databases, like the FDA’s MAUDE. You need to be searching those and reviewing the actual adverse events and recalls that have been submitted, and you need to show that data. That data needs to be reflected in your CER, but you also need to show a process for continuously doing that. So every year or every two years, depending, you need to file reports that are updates saying: we understand our clinical literature, we understand our adverse events and vigilance reports, and hopefully everything’s okay. Nothing has to change with our risk-benefit, nothing has to change with our claims, etcetera. But that process includes them both. So that’s a really good question.

Subhi Saadeh: [00:19:55] Where does EUDAMED fall in?

Ethan Drower: [00:19:55] Well, when it’s working, it will be a place where manufacturers can submit reports. It’s supposed to be a centralized database of these adverse events. Currently, that is not happening. Currently, you are still responsible for doing these on your own and working with your specific notified body on how they want to receive those updates and reports. The idea behind it is great, but we are skeptical that the implementation is going to be as smooth as has been promised. So right now, we don’t do anything with it for our clients.

Subhi Saadeh: [00:20:43] Okay, so then the problem, if I can summarize it from what I’m understanding, is that the literature search and your pool of data are constantly changing, with new studies and new results coming out. There’s increased scrutiny around this activity that companies have had to do historically, and the effort to ensure that you can replicate your results in an ever-changing landscape has become much more important. So how does software solve that?

Ethan Drower: [00:21:13] Correct. You’ve pretty much hit the problem. The problem is you need to do a better literature search, and that takes significantly more man-hours, which poses a big problem with staffing, with budgets, etcetera. So what we have tried to do is build tools that help ourselves and anyone that is conducting these reviews. The majority of the time these reviews take comes from the organization and the formatting of the data. So there’s a big opportunity to not just throw everything into Excel, sort it yourself, and copy and paste. There’s a big opportunity to use tools that will organize your literature, help you quickly review it and perform your assessments, and then at the end generate an output that’s actually usable, instead of one that you now have to export from Excel into Word and refit all the tables. You know how easy it is to make these copy-paste types of mistakes, especially if you’re going to be working with potentially thousands of articles. So the formatting and the organization. And then over time you have to continue to maintain these things, so it would be nice to be able to see what you’ve searched previously, what your previous results were, and how you assessed them. Over time you want to start accumulating some trending data, some historical data about your literature, and it would be nice to store that in a place that’s actually accessible, not just hundreds of PDFs on a share file somewhere.

Subhi Saadeh: [00:23:07] Okay, so I understand what you’re saying. You’re saying that in order to do the new activities effectively, part of it is having a validated output that allows you to organize the results in a way that you can just pull them and put them into a typed document, but the artifact is there if you ever need to go back. And then it sounds like there’s a querying activity that’s also running in the background to see what all is changing. Is that right?

Ethan Drower: [00:23:40] Correct, right. Because once you’ve done your first literature search and review and you’ve submitted, everything’s great. But at some point you’re going to need to update it. And how can you effectively do that without starting over? How can you pick up where you left off? Right?

Subhi Saadeh: [00:23:57] Yeah, so that makes sense. So in other words, the initial one is always going to be heavy lifting, but essentially a big part of the problem that you’ve potentially solved is having to redo an initial report every single year.

Ethan Drower: [00:24:21] Correct. How do you incorporate changes to the protocol? What happens when you want to change keywords, etcetera? How are you going to track that across versions? And yeah, it still takes time to search, pull the data, organize it, remove the duplicates, and keep track of all of the counts for each database and each search term. You want to record the results, and you’ve got to put all of that in a table that’s easily read and understood by your notified body auditor. And you’re going to need to do this each time you run these searches. Mistakes can be made if you do it by hand each time, and if you’ve got a big portfolio of devices, that’s a lot of spreadsheets moving around between departments, etcetera. So that’s where we found the biggest value-add for our tools: the organization and the time-optimization aspect.
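
As a rough illustration of the bookkeeping described above, a generic deduplication pass like the sketch below (hypothetical field names, not Citemed’s actual code) keeps exactly the counts an auditor wants to see: how many records each database and search term returned, how many duplicates were removed, and how many unique articles move on to screening.

```python
# Minimal illustration, not Citemed's implementation: pool results from several
# databases, drop duplicates, and keep the per-database / per-term counts.
from collections import Counter

def deduplicate(records):
    """records: dicts with 'database', 'search_term', 'title' and optionally 'doi'."""
    retrieved = Counter()                 # hits per (database, search_term)
    seen, unique, duplicates_removed = set(), [], 0
    for rec in records:
        retrieved[(rec["database"], rec["search_term"])] += 1
        # Use the DOI as the identity key when present; otherwise a normalized title.
        key = rec.get("doi") or rec["title"].strip().lower()
        if key in seen:
            duplicates_removed += 1
            continue
        seen.add(key)
        unique.append(rec)
    summary = {
        "retrieved_per_search": dict(retrieved),
        "total_retrieved": len(records),
        "duplicates_removed": duplicates_removed,
        "unique_for_screening": len(unique),
    }
    return unique, summary
```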

Subhi Saadeh: [00:25:17] In terms of the actual review, though, what all goes into reviewing the output?

Ethan Drower: [00:25:27] You mean you’re referring to the process of the literature review?

Subhi Saadeh: [00:25:31] Exactly. So everything that you described is around sort of controlling the inputs into the review. There’s an analysis that has to be done on the content of the review, correct? And so what about that?

Ethan Drower: [00:25:54] So that’s also a decent chunk of time, because you’re potentially gathering, you know, 1,500 to a few thousand abstracts, and they have to be read through and they have to be classified. They have to be included or excluded, and for those that are relevant, you have to do a much deeper analysis on them. And how are you going to keep track of a specific article and its relevant assessments throughout this process? That’s a big part of what we do: we have an interface that makes it easy to flip through and remove the abstracts that are not even close or are completely irrelevant. And then we also make it easy, once you’ve found articles that you like, to drill down into them, perform your extractions, your more detailed assessments of the actual study itself, and then store all of that data so it can easily be shipped out later.

Subhi Saadeh: [00:26:50] So, yeah. That process of looking at whether or not a study should be included or excluded is largely based on the abstract?

Ethan Drower: [00:27:04] In the systematic literature review, yes, your initial pass will be based on the abstract. What you’re trying to do is remove all of the studies that have nothing to do with your product. You’re just trying to remove the junk on your first pass: everything that’s clearly a non-human trial, or totally unrelated. That’s the intention there. After you’ve removed all of the articles that are way off, you should have whittled it down to some 100, ideally some 50 for most devices, articles that should be good and have the possibility of being legitimate studies that are going to support your claims. And that’s when the second pass, as we call it in the review process, happens. That’s when you actually dive in, go read the full text of the article, and start to make your assessments. Some of those get excluded in that process too, but you’re starting with a much narrower pool because you’ve trimmed the fat, so to speak.

Subhi Saadeh: [00:28:12] Okay. But historically, they would have had to have gone through every article in the first pass manually.

Ethan Drower: [00:28:18] Yes. I mean, depending on how you export it, most people just stuff them all into Excel and then try to wrangle the data to remove the duplicate articles. And you want to count how many of those duplicates you remove, so that’s a bit of an issue. Then what most people do is add a column and start putting their feedback in each subsequent column for each article. And that’s how they do updates, too. It’s the same way.

Subhi Saadeh: [00:28:49] So essentially the manual review process combines the first pass and second pass all in one. If you weren’t using some sort of software, then you would have to look at the article and ask, is this relevant? Okay, what is it saying? All at the same time, because you have it open. But if you’re following that process, which you called a systematic literature review, then you’re dwindling it down to a certain amount, which I think you said is somewhere between 50 and 100. Much more manageable. And then there is a filtering process where you’re saying, is it relevant?

Subhi Saadeh: [00:29:33] But more likely that it will be relevant. Yes, more likely that it will be. And in that second pass, that’s when you do your evaluation. What are you looking for? Because you said, you know, does it support the claims? But when you’re opening up the reports, what are you looking for primarily?

Ethan Drower: [00:29:51] So when our writers read through abstracts in the first pass, they’re mostly looking for things that meet the exclusion criteria. They’re not necessarily looking for the gems; they’re looking for reasons why a study would be no good, for example an animal study or something like that. Beyond that, and this always depends on the device, they’re trying to find studies that ideally have focused on the performance of the target device. If not, they’re looking for studies that could potentially support the state-of-the-art claim, which is kind of a new thing under EU MDR. So essentially, they know what the indications are, they know what they want to say, and they’re trying to comb through the haystack to find good-quality studies that have data that’s usable. Because on these bigger databases, there’s a lot of junk that comes through that is not as scientific as it should be, and you can’t just use anything that comes through. It needs to be defensible as a properly run study with data that you can trust.
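
To picture that first pass, here is a small sketch of abstract screening against exclusion criteria. The criteria and field names are invented for illustration (in practice they are defined per device in the search protocol); the point is that every excluded abstract carries an explicit reason, so the screening decision stays recorded and reproducible.

```python
# Minimal illustration, not Citemed's implementation: first-pass screening where
# every exclusion is tagged with the criterion that triggered it.
EXCLUSION_CRITERIA = {
    "animal_study": lambda a: a.get("population") == "animal",
    "wrong_language": lambda a: a.get("language", "en") != "en",
    "device_not_mentioned": lambda a: "example-device" not in a.get("abstract", "").lower(),
}

def first_pass(abstracts):
    """Split abstracts into (included, excluded); excluded records carry their reasons."""
    included, excluded = [], []
    for abstract in abstracts:
        reasons = [name for name, rule in EXCLUSION_CRITERIA.items() if rule(abstract)]
        if reasons:
            excluded.append({**abstract, "exclusion_reasons": reasons})
        else:
            included.append(abstract)
    return included, excluded
```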

Subhi Saadeh: [00:31:21] Understood. And then, going back to the validation bit, how do you address that now? I understand what you’re saying: the first pass removes things that are irrelevant, and in the second pass you can now look in detail, and through the software you can maintain what the outputs are of the review of the study, right? So you have it all in one place, correct? Okay. But how do you validate, how do you replicate that? And is there a sort of check-the-checker activity where somebody is also looking to see if they get to the same conclusion in the review of the study? Does that make sense?

Ethan Drower: [00:32:04] So it’s less about the classifications; those are always going to be based on a human’s decision, right, on the writer’s decisions. The notified bodies generally don’t argue specific classifications. They more so take a broader approach, asking: do we trust the credentials of the person who performed the review? And then, number two, can we reproduce it? So it’s rarely the case that they nitpick a specific study and say, “Oh, we disagree, you shouldn’t have included this one.” We don’t really see a lot of that.

Subhi Saadeh: [00:32:46] I guess before you go further, you said the word “classifications.” What does that mean in this context?

Ethan Drower: [00:32:53] A classification would be whether an article is included or excluded, for the most part. Yeah, any opinion in the process that’s being made by our writer, we generally just refer to as a classification.

Subhi Saadeh: [00:33:15] Yeah. And then, you know, you mentioned that because in medical devices, particularly if you look at 21 CFR 820 or ISO 13485, there’s the issue of training as well as competence, right? So what you’re talking about is that they question it if you don’t have a system and you don’t have credentialed people doing the activity; that’s where it becomes problematic. But they don’t question the judgment of a credentialed person who’s following a systematic approach. Right? That’s what you’re saying, correct?

Ethan Drower: [00:33:44] Yes. It’s been very rare in our cases.

Subhi Saadeh: [00:33:46] So what are the credentials for a medical writer?

Ethan Drower: [00:33:52] So, generally, a person conducting literature review activities needs to be able to show a good amount of what the notified bodies will call clinical experience. However, we’ve found a broad variety of qualifications. That can be anything from medical doctors to regulatory affairs people that have been writing for decades. So one of the questions that’s most commonly asked is: do your writers pass the qualifications? Because for anybody that’s not a medical doctor, it’s not explicitly clear who is considered qualified.

Subhi Saadeh: [00:34:43] There’s no kind of certification expectation?

Ethan Drower: [00:34:48] No, there’s nothing that’s specific to clinical evaluation. Maybe there will be; I’m not aware of anything they’re working on. But as it sits right now, you submit a CV along with your reports showing who performed the writing, and if the notified body has a problem with their qualifications, you have to send it for review through someone else that they’d approve.

Subhi Saadeh: [00:35:15] Oh, interesting. Okay. So it’s a bit of a gray area.

Ethan Drower: [00:35:18] Yeah. So it’s a situation where you have experience and understand what’s right, but there is no, um, wrong per se.

Correct. There’s no regulation outlining it.

Ethan Drower: [00:35:32] If you can defend your CV and prove your competence through your experience, we’ve found the notified bodies to be pretty fair about that. But, for example, if you just have an engineering background and you’ve helped build this device, it’s not going to work. You’re going to need a medical or regulatory background, and it’s going to have to be fairly extensive, to the tune of over five years.

Subhi Saadeh: [00:35:59] Very cool. That’s so interesting. Where do you see this going in the future, I guess, as you build it out?

Ethan Drower: [00:36:11] I would hope... You know, right now our tool primarily serves our own company, because we service a lot of clients, and then it serves a good number of individuals that recognize the need to optimize and keep their data organized. What we would like is to build a platform that is far more robust and supports entire departments within larger medical manufacturers. There’s a lot of shuffling that occurs within big companies between departments, because they’re a lot more decentralized. And we see the biggest opportunity there: helping them perform these reviews and keep everything up to date on, you know, a 500-device portfolio.

Subhi Saadeh: [00:37:01] Awesome. I guess as we close, what is something you’re excited about?

Ethan Drower: [00:37:07] Well, I’m definitely excited to do. Fireworks. Sorry.

Ethan Drower: [00:37:15] I’m definitely excited to see this industry continue to increase its sophistication and continue to increase its trust in and reliance on technology. There’s been a lot of hesitance when it comes to different tools, and everybody’s trying to build an artificial intelligence type of thing, and there’s a lot of pushback when it comes to using technology to help make these regulatory and safety decisions. I think that over the next ten years, that’s going to change, because hopefully there are companies that produce tools that can actually be validated and accepted across the board. So I do think it’s going to be a bit of a race to see who’s going to build the things that are most accepted and uniformly trusted. And that, I think, is a very exciting challenge that we’ve got ahead of us.

Subhi Saadeh: [00:38:27] Yeah, I think the software validation piece is super interesting and of the utmost importance. I mean, validation is a cornerstone concept in the medical device industry. So whenever, excuse me, whenever you’re working on a tool like that...

Ethan Drower: [00:38:35] Um, once it’s vetted out and out of the gate, then, you know...

Subhi Saadeh: [00:38:40] Well, you know.

Subhi Saadeh: [00:38:42] Um, yeah. So show your work. Yeah, exactly, show your work. The last question I have for you is: what is a book that changed your life, or one that you recommend?

Ethan Drower: [00:38:55] Oh, okay. Um. Let me think here.

I would say, from the entrepreneurial side, and I hate saying this book on podcasts because it has the most shallow, terrible title, it’s a book called How to Get Rich by Felix Dennis. The reason he wrote it, he put that title as kind of an ironic jab at the industry of trying to make money and build companies, and he spends the whole book expressing his philosophy on why business and money are not the true pursuits that are going to fulfill you. So it’s a funny, silly title to read, but when you read his prose, you actually start to understand what he’s trying to say. For me, it helped clarify exactly why we were in business and what our focus should be. Like I said, we want to be problem-focused. We don’t want to be money-focused. We don’t want to be fame-focused. Those things can come when you build the most useful things, the most useful tools that help the most people. So that mental shift, which happened years ago, was a very pivotal moment for my entrepreneurial career specifically.

Subhi Saadeh: [00:40:29] Yeah, it’s no different than, I think it’s called, the happiness paradox, right? The less you try to be happy, the happier you are, type of thing.

Ethan Drower: [00:40:37] Exactly.

Subhi Saadeh: [00:40:38] That’s really cool. Um, any final thoughts?

Ethan Drower: [00:40:44] No, I think this was a good one.

Subhi Saadeh: [00:40:45] Awesome.

Ethan Drower: [00:40:49] I’m glad. No, the honor is mine.

Subhi Saadeh: [00:40:51] How can people reach you?

Ethan Drower: [00:40:53] So you can find me on LinkedIn.

I’m very active there and we post a lot of the team’s articles. Um, and you can also find us at our website, which is just citemedical.com.

Subhi Saadeh: [00:41:08] Well, thank you, Ethan, for coming on to the show. It’s been an honor.

Ethan Drower: [00:41:12] Thanks so much for having me.