Bridging the Gap: Academia and Industry in Wearable Tech with Dr. Jessica Zendler

Dr. Jessica Zendler's role links academia and industry, aiding companies in grounding their products in science and assisting sports groups in embracing research for enhanced player health. Join us as we discuss her transition from academia to industry, the growth of wearables, the complexities of assessing human movement outside a lab, and striking a balance between innovation and quality in tech development.

Nov 29, 2023

Episode Highlights

(01:47) Jessica's Academic and Professional Journey

(05:04) Measuring Human Movement

(12:36) Balancing the Scales: Academia vs Industry

(15:08) Tech Talk: Quality vs Novelty

(20:09) Unraveling the Quality Framework in Sports Tech

(27:55) Human-first Approach in Science


Quality Assessment of Sport Technologies’ Special Interest Group White Paper

Rimkus Sports Science Consulting

Connect With Jessica Zendler


All-In-One Research Solution for Real-World Data Capture

Labfront is a global startup specializing in health data analytics. It is currently disrupting academic health research through its Labfront platform, a code-free digital biomarker collection and analytics solution. With the recent explosion of sensors in the scientific community, Labfront is helping health researchers process the overwhelming amount of complex data and transition to the data-rich future.


Jessica: One of the most important things to remember with a lot of these wearables is that as we move away from the lab, we're moving a lot of times away from direct measurement and more into an indirect measurement of something.

So there's gonna be just inherently more error there. And you could take a purist approach and say, oh, that's really bad, I just need the most accurate thing. Okay, great. Then go to a sleep lab and do that. That's just not really practical, and that's pretty expensive. So then we embrace a wearable. It's less accurate, but I get to do it every day. I get to track trends. Sometimes it's more important to look at what our trends are over time.

John: Welcome to Human Science, a podcast exploring the human element behind the science that shapes our everyday lives. We're powered by Labfront, the go-to tool trusted by researchers looking to automate their studies and transform real-world data into health insights. We have Dr. Jessica Zendler here with us today. Jess has navigated the realms of academia, startups, and now human performance tech consulting. We'll discuss her transition from academia to industry, how to best assess new technologies, and the challenges of balancing quality and innovation in the tech space. So everyone, please welcome Jess.

Jessica: Thanks so much for having me today.

John: Thank you so much for making time out of your busy schedule. Full-time mom, full-time consultant, you're doing so many cool things in the world of human performance tech.

Jessica: Thank you. Appreciate that. It's well, it's fun to be on today, so I appreciate the time.

John: Awesome, Jess. Well, I was hoping we could maybe start at a bit of a high level, and then we can really dive into some deeper topics.

Introduction to Jessica Zendler

Jessica: Yeah, that sounds great. I'll try to keep it pretty succinct. My background first and foremost is as an athlete. I played sports, and soccer kind of was my love all my life; I played it through college. In college, I studied biomedical engineering. I was just really interested in the human body and solving problems with the human body.

One of the first problems I saw playing soccer was that a lot of my teammates were getting pretty broken, and I wanted to understand why. Why did injuries happen? Was there a way to prevent them? A lot of them were having knee injuries, and it just seemed like there had to be some better way to understand the athlete's movement and prevent these injuries from happening, or at least do a better process of rehabbing them back.

And so when I finished up college, I didn't really know what I wanted to do next, and I found out that PhDs get paid for, so that sounded like a good deal. I didn't really know what I was signing up for with the workload there, but I decided to get a PhD in kinesiology, which is movement science, and mechanical engineering, and was really trying to just dig deeper into understanding the human body.

And so I spent some time really studying the human knee and understanding how movement changes with injury. Fast forward: when I finished my PhD, I stayed on at the University of Michigan and was part of the human performance research lab there. And so I really had this awesome opportunity to work with the Michigan athletes and work a lot with industry.

We had, in particular, a contract with Adidas where we were doing a lot of research on different products with them, and I got a chance to just start thinking more practically about how different products influence athlete movement. How can we measure athlete movement and try to understand signals for injury, signals for recovery, different ways that we can optimize performance through product or through training with the athletes?

In that time, we really started to use a lot of wearables, because we were taking these really fantastic athletes and trying to test them in the lab, and it just wasn't realistic. And so we started spending a lot of time on that. Those were the early days of wearables; in 2013, Fitbit was still just becoming a thing, and Jawbone still existed.

You know, we were just trying to understand how we could measure athlete movement. So I really just got started there with understanding athletes and understanding how they interact with product. I got to lead that lab, eventually set off on my own and left academia, was briefly with a startup that was also looking at athlete performance through wearable technology, and at a certain point decided to just go out on my own.

And so I've been consulting since about 2018, and I still carry an adjunct appointment at the University of Michigan, so I have a really close relationship there, and I still do research. But my main day job is to help companies really develop out the science behind their product.

And then on the flip side, I work with sports organizations to embrace science and research to improve their processes, the products they're bringing into their leagues, and how they're approaching player health.

I love science, and I love athletes and human movement. And so finding a way to bring that out of academia and make it really accessible to companies and sports organizations is one of my major passions.

Measuring Human Movement: Gold-Standard Lab Equipment vs Wearables

John: Well, that just gets me so excited because there's so many topics, actually, that I want to dive into, but for the sake of time today, I was thinking maybe we could start with just that concept of academia. So, I know you are now doing a lot of work outside academia, but maybe we could backtrack just a little bit to the academia side of maybe helping our audience understand, as you said, those early days, you know, 2012, 2013 wearables.

When it used to be, athletes had to hook a ton of things up to their bodies. They had to maybe go into labs; you had to maybe potentially keep them overnight. All these kinds of unnatural ways to train and recover and take care of their body to get that data. But what was that like inside academia at that time?

In the sense of, how is it different now from using a wearable device and just being able to do whatever you're doing?

Jessica: Yeah, it's an interesting question, and in some ways it still hasn't changed in a lot of ways. We still have, whether inside academia or outside of it, our gold standard ways of doing things, but especially at that time there weren't as many options to get out of the lab, for sure.

And so if we wanted to measure human movement, we pretty much had to put markers on the athlete. They had to go on their skin. That meant that we were pulling up their shorts into basically a diaper and taping them up, you know, having them shirtless, having to do all this markering that takes about 30 minutes to do.

And then we have them in this lab space that's pretty confined. It's not like a field setting. We have them run and cut off a force plate and then immediately decelerate. And it's just not very realistic. But we didn't really have any wearables at the time that could measure joint movement in a good way.

Now that technology, inertial measurement units or IMUs, has really evolved a lot, and there are more and more technologies coming on that are portable and get us out of the lab. And so it remains this trade-off and this tension that I certainly live with every day, because the methods that get us out of the lab usually are not quite as accurate or as reliable as our lab-based methods.

But they provide us more realistic information. And so it remains sort of a challenge, and I think academia is embracing it as well. There are a lot of labs now where you have your lab-grade equipment, and then they also have these portable sensors that they're taking out to do more field-based studies.

But it's always this tension of what the end goal is, what we're trying to get out of that signal, how accurate it has to be, and how reliable it has to be.

John: Yeah, yeah, it does sound like a challenge, and I'm so curious about it myself, because at the moment I'm wearing a Whoop, and I love it. But, to be fair, it also gives me a little anxiety, in the sense of, like, oh my gosh, my sleep quality, or, you know, I didn't move in a certain way. But thinking about what you just said there, maybe you can give our audience an understanding of what is measured in a lab versus what can't be measured with wearables.

Jessica: So, if we talk just about sleep, the gold standard to measure sleep would be to go into a sleep lab, and those exist. You would go in, you would spend a night there, you would check in at 8 o'clock, and you would have a bunch of sensors hooked up to your head and hooked up to your body. They're primarily measuring your brain activity.

That's how they're going to define, for the most part, sleep and the sleep stages that you're in. They're also going to collect different breathing patterns, your movement patterns, all these other things that help them score your sleep. So are you asleep? Are you awake? And then if you are asleep, what stage of sleep?

So really they're looking at brain activity. If you notice with your Whoop, your Whoop is not hooked up to your head, it's not measuring brain activity. Instead, what it's doing is it's, it's getting surrogate measures.

So it's using the movement at your wrist. It's using, heart rate and other measures, you know, it's kind of their proprietary algorithm of things they're collecting and putting those together to estimate whether or not you're asleep or awake. And then what sleep stage you're in.

One of the most important things to remember with a lot of these wearables, or just these devices, is that as we move away from the lab, we're moving a lot of times away from direct measurement and more into an indirect measurement of something. So there's gonna be just inherently more error there. And you could take a purist approach and say, oh, that's really bad, I just need the most accurate thing. Okay, great. Then go to a sleep lab and do that. That's just not really practical, and that's pretty expensive. So then we embrace, okay, here's a wearable. It's less accurate, but I get to do it every day. I get to track trends. Sometimes it's more important to look at what our trends are over time.

So it doesn't really matter if you actually slept 7 hours and 50 minutes or you slept 8 hours and 2 minutes. If consistently over time the device shows that you've been increasing your sleep and at the same time, you know, you've started a yoga practice, okay, then you can maybe make a correlation there and it can show you some really valuable information from that day to day that you just can't get from going into the lab.

John: I hope that my partner will listen to this at one point, because she comes from academia. She's a doctor. She's just like, please take this with a grain of salt. You know, it's making estimations, right?

Jessica: Exactly, exactly. And I'm married to a doctor as well, and he's an EEG specialist, so it always gets him riled up. But we have to get less focused on what exact measurement it's giving us, and more on what we're trying to do with that measurement. We have to remember, they're just looking at a signal, they're looking at some data point that we've put some value on. But they don't know us, necessarily. And how you feel is still really, really important. So I think it's important to, you know, take the data and use it to learn and check things.

And especially if you are a data-motivated person, it's a great way to try things out more and get that feedback. But a lot of times, you know, we can do a sleep diary. That can tell us a lot of information right there, and that's a very low-tech way to do it. Now, it may not be the desired way to do it, it may not be as enjoyable, but that's always an option too. So I think we kind of have to think about what our question is.

And the other thing, too, is, you know, sometimes it's more about behavior modification; it's not really even about the data. For example, for me, I use an Oura ring. I gotta be honest, I don't look at that data very much. But what I did find really nice is that I started looking at the heart rate variability.

And the numbers didn't really mean anything to me; they still don't. But now I've started to get a sense of what my fluctuations are. When I see that HRV drop big time, it's a great wake-up call: Jess, get your act together, this is now really affecting your health.

And so sometimes it's just those more behavioral things or things that start a conversation. And if that's what the wearable does, then that alone, you know, it got me reading more about HRV. It got me learning about my autonomic nervous system. It got me just thinking about how stress affects my life.

Then maybe that's a big success.

John: Yeah, you nailed it too, 'cause my main data point is my HRV, and it's really just being like, John, you know what? You probably should just do a little breathing exercise right now, my man. Like, you know, just slow down, slow down. It's all right. The work will still be there.

Jessica: Exactly.

John: Awesome.

So I sense this is a balance you're now striking in your own life, between academia and industry, tech and business. How do you balance that? And do you feel that more academics should be looking to make that bridge? Or is that really now your whole purpose, like, hey, Jess wants to be the bridge to this world?

Balancing Academia and Industry: Why Jessica Left Academia

Jessica: I just don't want any competition. No. I love it. No, no, it's a great question. It's just a weird sweet spot that I found myself in. I really have a strong appreciation for academia and research and the people that live in that world; I spent a good amount of time in that space and see the really tremendous value it brings. I also saw a lot of the value that got left on the table. And that was one of the reasons that I left academia: I was frustrated that the projects that I wanted to do, the industry partnerships that I saw as opportunities, just didn't fit well into the academic space. And the questions that interested me were not things that would be funded by traditional models like NIH and NSF.

And I also saw a lot of companies that just didn't know how to access science. There was this need to connect with good resources, good people, content experts, just people to help walk through what science is and how to do science. And there were a lot of companies out there that wanted to do that but didn't know how, or, if they tried to do it with an academic partner, hit a wall, because there wasn't this great opportunity just to have a consulting relationship or have a dialogue.

It had to be very discrete projects. The projects had to produce publications; they had to lead to further grants. The sponsor didn't have a lot of say in how the project went. They were giving up a lot of ownership, and that sort of makes sense for a lot of academic things, but it doesn't work as well in industry.

And so I saw a lot of opportunity to do good science getting left on the floor. And so for me, I've just had the happy luck of being in a position, through some different roles that I've had, where I get to work with industry who wants to do the science, and I get a chance to understand their business needs, their pressures, and their goals.

At the same time, I work with the different academic partners who do genuinely want to do that research but maybe need help figuring out how to do it in a way that aligns with the sponsor's objectives, and in a way that still aligns with their university's goals, and crossing that communication bridge, because sometimes the groups are talking past each other. Because research is hard, and I think a lot of times it's easy to look out from the industry side and say, why can't they just do this project for $10,000 and get it done? And not understand: no, this takes a lot of time. Figuring out research, collecting data in a systematic way, recruiting subjects, getting all the processes right, cleaning all that data, doing the analysis.

It's not a simple process. And helping both sides understand where the other is coming from is one of my goals.

John: I love it.

Quality vs Novelty: Evaluating New Tech

John: Are there any red flags you could point out to help maybe the entrepreneurs listening not get really caught up in just the shiny new thing? Especially right now in the world of AI, and how to balance that with good science.

Jessica: Yeah, that's a tough one. I mean, that is the tension. I'm still learning how to navigate that. Especially in the startup space, you wanna move fast, break things, try new things, and that's great. That is what it's there for, and I don't think it's bad to chase and to try these new things.

But I would say my recommendation would be to put some process in place internally where you're checking the quality of what you're doing somehow. Whether that's a gut check, or some kind of set of tests that you can run and see, hey, when we do this AI, are we getting answers that make sense?

Is there a way that we can create some kind of reference for ourselves that we can go back and check? That, I think, can be done internally to allow someone to move fast. And then, at a certain point, when I think there's something here, when I think it's good enough, I need to be a little more rigorous about how I actually test it and be a bit skeptical, which is hard.

It's really hard when it's our own product and it's our own baby and we've been getting into it. But I think remember that, at the end of the day, you can lose customers if it doesn't work. So step back at a certain point and find those partners when you're ready to take a breath, ready to take a little bit more time and a little more rigor to check what you're doing. I think it's finding when to bring in that additional testing at the right time.

Find a way to do that in conjunction with moving fast in other ways, but recognize that good science isn't always super fast. We can speed it up as much as we can, but it can take time. And especially when we're talking about doing it with humans, it just gets messy. It's hard to recruit subjects.

It takes time to get them in, time to test, and all that. And so, having some patience there, recognizing that it is worth the investment, but it just takes a bit longer. Hopefully I can help the process.

John: Yeah, I think it's well said. And I love how you're kind of saying, though, you know, that industry is there sometimes to move fast and break things. And it's a symbiotic relationship.

Jessica: I think that's the important part. Academia also is a great spot to explore new ideas and look for some real innovation, look for some really sophisticated technologies that you're not just going to get off the shelf, and some really brilliant minds that you can work with. And so, what is your purpose? What is the need? What is the question you're asking? If it's finding the foremost expert in area X and having them do some really innovative work, that's a great spot to work with an academic.

And especially if it's something that has a three-to-five-year timeline, then you can really dig into that. If it's something like, we just want to move fast and see if this works, that's probably something best done in-house, or by finding the right kind of testing partner that can just move really quickly with you in a different way and just turn the data out and get it back to you.

And then there's somewhere in between, where it's, you know, we think our device does this, we have a product that does X, but before we go to market we want to make sure we can back those claims, or we want to produce papers for that. And then that might be an academic partner.

That might be a consulting testing house like what we do, that can move faster than academia, can turn that around more quickly, but still do it in a really rigorous, scientific way and get those results that you need to publish and to move forward with your marketing.

Rimkus - Jess' Consulting Work

John: So that's essentially what Rimkus is doing: offering that quick, or faster, I should say, consulting approach with in-house researchers and scientists. Is that it in a nutshell?

Jessica: Yeah. For my group in particular, that's what our goal is: to bridge that gap between academia and industry. We're here to do those projects that maybe are going to be really time-consuming to do in academia and aren't necessarily the sexiest for academia, like doing a validation study. You know, it doesn't get them more grants, it doesn't necessarily get them more publications, it can take a long time to contract, and there are just some hurdles with working with academia to do that and finding the right partner. For us, we're here to partner all the way with the company, from the start. I mean, it can be scientific literature reviews, just helping the company understand the science and research in the space for their product and things they should consider as they develop out their product.

To, you know, their coming with a pretty much finished product and saying, hey, I'm pretty sure this product does X; it's a big part of my business development strategy that we have publications out there that say these things, so that we can market to these groups that are going to ask, where is your proof?

And we need to do those fast. Can you work with us to get that testing done? Sure, because we can work quickly. We can make customized protocols. We're not worried about setting up something that leads to more grants for ourselves. Our goal is to provide research as a service.

We'll always do it with the highest scientific integrity. We're not going to fabricate any data or anything like that. We're going to operate like any other research lab, but we can do that just in a more efficient way and really be focused on meeting the needs of the client rather than the needs of our research agenda.

Quality Framework - How to Assess Good vs Bad Tech?

John: It's fantastic. And it makes me think, too, we just had Sam Robertson on, and he was talking a lot about the quality framework. And I believe you had a huge part in developing these industry standards, really, that any sports team or sports organization as a whole can look to as the technology starts to really ramp up. So can you talk a little bit about the quality framework?

Jessica: Yeah, for sure. And yeah, Sam is fantastic; I'm glad you got to speak with him. I have been really fortunate to get to work with him and with several other folks on this, as part of the Sport Tech Research Network and as part of the working group that's worked on this quality framework.

I think probably anyone listening to this podcast can say that there's a huge variation in the quality of technology that's out there, and we don't have a good toolkit for assessing what makes good or bad technology. And I think the challenge is that it's really in the eye of the beholder.

So if you talk to someone, depending on what they're doing, what they're interested in, and their background, there are a lot of different points of view on what makes quality technology. For example, if you talk to a researcher working in academia, even someone like myself, one of my personal biases is that I'm going to think about the accuracy and the reliability of the device. If you talk to someone else, especially, say, a sports practitioner, someone that's working with athletes day to day, they're thinking about usability.

It's great if it's super accurate, but if the device takes up the whole room, it doesn't matter. It's not practical. Or if it takes two hours to set up, it's not practical. If it's ugly, the athletes don't want to use it. For them, it's got to be usable first.

So there are all these different viewpoints on what's most important. And I think what comes out of that is that they're all important, and if we just focus on one, we lose sight of the bigger picture. And there's no perfect device.

There's no perfect technology. It's a trade-off. This is the real world. And so what we tried to do, really, is put together this framework that would not tell people how to make decisions, and not decide this is good, this is bad, but help them walk through that process themselves and give them a paint-by-numbers of things to think about.

It's just figuring out what the right answer is to meet your needs. I've been fortunate to be part of the NBA's wearables validation program and have gotten to see a lot of technologies there, and I just recognize that there's this big challenge of how to figure out what's good or not. So that is the goal of the quality framework.

John: And it just seems like it's going to become more and more important. So thank you so much, Jess, for the work you and Sam and the rest of that team have done.

Jessica: Yeah. I think folks assume, and I also assumed, that if a product says it does something, or it's this accurate or whatever, there's some governing body that's certified this, some standards body that's checked this. That doesn't exist. It doesn't exist in the sports and human performance space. If it doesn't have a medical claim with it, it's not hitting these kinds of regulatory pieces. So everything has to be taken just at face value. And oftentimes I take whatever a manufacturer claims about a device and I knock about 50 percent off.

And that's usually where I find that device actually performing. So I would encourage folks to, you know, at least use that framework to start asking questions of technology manufacturers, to at least think through and empower themselves to make more considerations about the technology that they're looking at, and to try to dig up more information. But we also really welcome feedback as people use it: say, hey, this doesn't work, or this doesn't make sense. That's great feedback if there's stuff that's not working with it.

John: I love it. That's such practical advice right there. Take a manufacturer's claim and cut it by 50%. That might be the quote of the episode.

Jessica: I hate to say that, but yeah, I think it's just, it's just how it goes.

John: It's so vital. And thank you so much for that transparency, because it is so easy for everyday consumers to just get caught up in the marketing of it all. It's so exciting, it's measuring all these cool things, it's giving you all this cool data and feedback. And really, you know, it's giving a best estimation of where we're at and where the algorithm is at the moment.

Jessica: Exactly. And then I would say, one of the things that I try to at least tell my friends and family when, you know, someone's got questions like, hey, what's the best device for this? Usually my first question is, what do you want to use it for? These devices, they're not a solution unto themselves. They're a tool. They're a hammer or a screwdriver, but they have to have the right problem to fix, you know. So figuring out what tool you need and why, I think, is really that first step, even before thinking about the quality of the technology.

Because then that sets up, what do I care about? Do I care about its accuracy? Like for me, the Apple Watch is cool, I know the Apple Watch tracks sleep, but then the Apple Watch has to be charged every day, and that's just not practical for my lifestyle.

Just chasing a tool because it's a tool sets us up for disappointment, whereas if we're a bit more thoughtful and targeted about why we want one of these devices, then I think we have a better chance of success and enjoyment with it, and of just getting more value for the time that we ultimately have to spend with it, because these aren't just automatic answers.

Looking Forward

John: Do you mind, Jess, entertaining me? As you are really at the forefront of academia, industry, and business, where do you see the world of sports tech and wearables going? Are there any problems that you would like to see solved?

Jessica: Certainly. I mean, the easy answer is AI is going nowhere. And nowhere meaning, like, it's not going away anytime soon.

John: I was like, interesting. Yeah. Wow. This is controversial.

Jessica: That could have been a hot take. Yeah. It's just gonna fail.

John: No, AI is dumb .

Jessica: It's the worst. Oh. I think, you know, AI's here to stay, and its capabilities and what it can do are gonna just continue to grow exponentially. I think the challenge will be, and this is outside my realm of expertise, I can spell AI.

John: That was amazing. I love that.

Jessica: Oh, but that's about my expertise level there. So I'm a novice with the rest of them. But I think, one, being cautious. You know, we don't just throw AI at a product and then it works because it's AI. I think one of the challenges for all of us is to become educators. Certainly folks investing in this space, doing work in this space, should really start to understand better what AI can do, what its limitations are, and what the questions are that we should be asking about how it develops and what it's doing with the product.

But I think we also have to be careful not to be blinded by it as some magic bullet that will just fix everything. And there are a lot of ethical questions that come with AI, so trying to be a bit more thoughtful and pragmatic about that is going to be one of the challenges for the field moving forward: building off of lots of athletes' data, what that means for developing a product, what the legal implications are, and what the ethical ways to do that are.

And also, what are the limitations of that? If the model is all fed on male, 22-year-old, perfectly able-bodied athletes, that still doesn't mean it works for a 45-year-old female who's had a stroke. So, being smart about what these models can give us and where they can go.

At the end of the day, I still see a need to educate people about science, what science is, and how to ask good questions. Not taking things at face value will always be one of the challenges in this field, because there's always the next sexy thing.

And so figuring out how to be skeptically optimistic and not get blinded by the next hottest thing.

How to Be a Better Human Who Does Science

John: Well, Jess, as we start to wrap up here, I would love it if you could leave our audience with any of the rules that you live by as someone who cares so deeply about the human body, about science, about math, about engineering. What has really fueled you along the way to become a better human who does science?

Jessica: Oh, wow. That's a good one. I will say, what I try to remember at the end of the day is that we're all just people trying to do our best, and these are really tough problems.

We're all trying to get home to our families. We're all trying to make a product, make some money, find a way through, do some good. And I think the more we can recognize that, the more we can have really productive conversations, even when we talk about bridging academia and industry.

Instead of focusing on our differences, recognize that we all just want to do good science and good work. We may have different perspectives on how that gets done, and we have different pressures on us from our institutions, our bosses, and our incentive systems, but we're all trying to get to the same goal.

And so the more we can focus on that, and just step back and remember that human element when we're developing these devices, working through these business transactions, and figuring out a path forward, the easier it is to do the work and actually find productive solutions.

John: That's awesome. You're a rock star. Is there any place online where people can interact with you, follow more of your work, maybe read some of your papers?

Jessica: Yeah, so you can follow me on LinkedIn. It's just Jessica Zendler on LinkedIn. I try to post stuff there as it comes up. You can also check out our website and go to our sports science consulting page to learn more about what we do, and you can contact me through that as well. I'm otherwise not very active on the socials because I have a three-year-old to chase most of the time, but those are good ways to find me.

John: Mm hmm. Awesome. The three-year-old to chase, that's the real science. Very cool. Well, Jess, thank you again so much for your time today. I wish you a wonderful, continued successful career, and I hope to talk to you again soon.

Jessica: Thanks so much. This was great.

John: Thanks for listening to Human Science. If you enjoyed this episode and you'd like to help support the podcast, please share it with others or rate and review it. All the show notes and links can be found over at labfront.com slash human science.
