IntelinAir's AI-Driven Image Analysis is Saving Crops - Down on the Farm Today, But Tomorrow…


MoneyBall Medicine



This week on MoneyBall Medicine, Harry takes a field trip (literally!) into farming and agriculture. His guests are Al Eisaian, co-founder and CEO of crop intelligence company IntelinAir, and the company's director of machine learning, Jennifer Hobbs. IntelinAir's AGMRI platform uses customized computer vision and deep learning algorithms to sift through terabytes of aerial image data to help farmers identify problems, like weeds or pests, that can go undetected from the ground. The parallels to the digital transformation in healthcare aren't hard to spot.

Harry has talked with scores of guests about advanced computer science techniques like neural networks, computer vision, and machine learning, and how they're changing the way healthcare providers can find patterns in genomic data or radiology images. But the fact is, these same techniques are being used to generate new kinds of actionable insights in many other areas, including agriculture. In fact, today's farmers are almost overwhelmed by the volume of imagery available to them from drones, airborne cameras, and satellites. IntelinAir uses AI techniques to spot patterns and trends in these images, in a bid to help farmers address problems before they get out of hand, while making smarter use of fertilizers and pesticides. Which sounds a lot like using digital health data to keep patients healthier while making smarter use of pharmaceuticals. So don't be surprised if ag tech companies end up having a thing or two to teach the digital health industry.

You can find more details about this episode, as well as the entire run of MoneyBall Medicine's 50+ episodes, at https://glorikian.com/moneyball-medicine-podcast/

Please rate and review MoneyBall Medicine on Apple Podcasts! Here's how to do that from an iPhone, iPad, or iPod touch:

• Launch the "Podcasts" app on your device. If you can't find this app, swipe all the way to the left on your home screen until you're on the Search page.
Tap the search field at the top and type in "Podcasts." Apple's Podcasts app should show up in the search results.
• Tap the Podcasts app icon, and after it opens, tap the Search field at the top, or the little magnifying glass icon in the lower right corner.
• Type MoneyBall Medicine into the search field and press the Search button.
• In the search results, click on the MoneyBall Medicine logo.
• On the next page, scroll down until you see the Ratings & Reviews section. Below that, you'll see five purple stars.
• Tap the stars to rate the show.
• Scroll down a little farther. You'll see a purple link saying "Write a Review."
• On the next screen, you'll see the stars again. You can tap them to leave a rating if you haven't already.
• In the Title field, type a summary for your review.
• In the Review field, type your review.
• When you're finished, click Send.
• That's it, you're done. Thanks!

TRANSCRIPT

Harry Glorikian: We talk a lot on the show about advanced computer science techniques like neural networks, computer vision, and machine learning, and how they're changing the way healthcare providers can find patterns in genomic data or radiology images. But the fact is, these same techniques are being used to generate new kinds of actionable insights in many other areas. And today I thought it would be a fun exercise to take a field trip… literally!... into farming and agriculture. Just like doctors, today's farmers are almost overwhelmed by the volume of imaging that's now available to them. In the clinic, these images come from MRI machines and other types of scanners. On the farm, they come from drones, airborne cameras, and satellites. And in both cases, if you can use AI techniques to spot patterns and trends in the images, you're then in a position to address problems before they get out of hand.

We're about to meet two executives from IntelinAir, an ag-tech startup that offers a so-called "crop intelligence platform" called AGMRI.
It consists of customized computer vision and deep learning algorithms that sift through terabytes of aerial imagery to help farmers identify problems that can be hard to spot from the ground. We're talking about things like weed infestations, nutrient or water deficiencies, weather damage, insect damage, fungal damage, and poor tillage or drainage patterns. The company flies over clients' fields up to 13 times per season, which means it can provide a picture of the evolving health of the crops in those fields. Ultimately the goal is to help farmers increase yields while making smarter use of fertilizers and pesticides. Which sounds a lot like keeping patients healthier while making smarter use of pharmaceuticals. But as we'll hear, the flood of new data that's available to farmers is even bigger in some ways than what's available to doctors. So it won't be surprising to me if ag tech companies end up having a thing or two to teach the digital health industry. So without further ado, let's meet IntelinAir's co-founder and CEO Al Eisaian, and its director of machine learning, Jennifer Hobbs.

Harry Glorikian: Al, Jennifer, welcome to the show.

Al Eisaian: Hey, thanks for having us.

Harry Glorikian: No, it's great to have you guys on the show. And I know that I'm slightly stepping out of the bounds of what would be looked at as traditional health care. But I thought this episode would be really interesting to go into from two sides. One is, obviously, you guys are in the agricultural space, and agriculture is, as far as I'm concerned, paramount to health. As a matter of fact, it's probably a better way to keep people healthy, if they just eat better. And the other side of it is the image analytics. I've always looked at it as, the technology doesn't necessarily care what it ingests. It has the ability to see all sorts of features, whether that's a crop or an insect or an image on a radiology scan or, you know, a pathology slide. There's all the...
I think the technology can blur where it is and how it's applied. But before we get started with that: Al, tell us the origin story. How did IntelinAir get started? How did you end up doing this?

Al Eisaian: Sure. It's an interesting story, because I was invited to talk to a lot of PhDs and graduate students about entrepreneurship back in 2014. So I was invited to go to UIUC and give a talk. And so I did. And as you know... I landed in Chicago and then you drive through three hours of corn and soybean fields. And it was interesting. I didn't think much of it. But during the few days that I was at UIUC, they took me through all the very impressive buildings and very impressive labs that they had. I had no idea that, you know, Ray Ozzie went there. I had no idea that Marc Andreessen graduated from there. And so there were all these buildings that I was looking at. And then they took me to the Ag Department. So I found out very quickly that UIUC was one of the epicenters of data science. Fei-Fei Li went there. So this whole stuff with deep learning and ImageNet and all that actually had its origin there. And then it went to Princeton and then Stanford.

Al Eisaian: So I knew nothing about agriculture. But I had just recently sold my company and I was thinking, where do I spend the next decade of my life or more? And I wanted to do something that had global impact. And I've been a little bit of a sustainability nerd for a long time. And I sort of put two and two together. After about a year of doing research, I said, yeah, this is an area where I can bring my passion for big data and data analytics to agriculture and try to make something that would be more than just about making money. And so that's how IntelinAir was born.
My co-founder, Professor Hovakimyan, is a very storied professor at UIUC, and so we kind of put our heads together and we said, this is what we can do. And when you look at even the name, IntelinAir stands for intelligence in air. So it's really around observation. It's really about: if you want to improve anything, you have to measure it, you have to measure it frequently, and you have to validate that measurement, and then actually put science to work. So that's the origin story of IntelinAir.

Harry Glorikian: So at the highest level, what's the value proposition here? Are we trying to make farming more efficient, more sustainable, more productive? All of the above? I mean, how are you guys thinking about this?

Al Eisaian: Yeah, so I'm big on names that actually describe what we do. So the name of the product and the service is AGMRI. A lot of analogs from health care. AG stands for agriculture, and MRI, the way we describe it, is Measurable, Reliable Intelligence. So you agree to the thesis that if you want to improve something, you have to frequently measure it, observe the behavior of the thing you're trying to improve properly and accurately, and then make the types of decisions that allow you to introduce those improvements. So at the highest level, it's a comprehensive crop performance intelligence platform. And when I say comprehensive, I mean full scope and I mean full season. When I say full scope, it's everything. It's not just imagery. It's soil information, it's weather information, it's everything that is absolutely essential for growing crops. It's the farmer's practices. It's getting the IoT information off the equipment. It's all of those things combined. It's what we call our gigantic data ingestion challenge. Right? We're talking about petabytes of data.
The value proposition really is around timely, actionable insights that allow not only the farmer but the whole ecosystem of farming to benefit and make better decisions. So it's something that provides value to the entire value chain.

Harry Glorikian: So what is the special sauce of this? What can you do with high-resolution field images that no one else can do? What is the computer vision... and I feel like Jennifer's about to jump in here any second now and tell me... but what is that special sauce that you guys have brought to the table? Because I have a feeling there are multiple layers of information that are getting stacked on top of each other to, I want to say, tell a story of what's happening. Tell me how you guys would describe this. And remember, there are probably no farmers listening to this podcast. So if you could put it into context, because at some point I can almost see that this approach has a superimposition onto different parts of health care.

Al Eisaian: Sure. Jennifer, do you want to take the lead and I'll chime in as necessary?

Jennifer Hobbs: Absolutely. So much like health care, our data is truly huge. A lot of people talk about big data, but our data really is big, and it's big in a lot of different ways. As Al mentioned, we have lots of different channels, right? We have RGB, we have near-infrared. We also have things like thermal. We have the soil information. We have the topo maps. We have all of this information that we can incorporate into the models. Then additionally, we have it at high resolution. There's a lot of work in computer vision in agriculture that in the past has been limited to publicly available, low-resolution satellite data. And it's great that it's out there and it's free and it covers lots of different areas. But there's only so much you can see at that resolution.
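To put the resolution gap in perspective: a 10-meter satellite pixel covers 100 square meters, while an 8-centimeter aerial pixel covers 0.0064 square meters, so the same field produces roughly 15,000 times as many pixels. A quick back-of-the-envelope sketch (the 10 m and 8 cm figures come from the conversation; the 100-acre field size and one-byte-per-channel storage are our illustrative assumptions):

```python
# Back-of-the-envelope: pixel counts and raw storage for one 100-acre field
# at satellite (10 m) vs. aerial (8 cm) ground sampling distance.

ACRE_M2 = 4046.86          # square meters per acre
field_m2 = 100 * ACRE_M2   # a hypothetical 100-acre field

def pixels(gsd_m: float) -> int:
    """Pixels needed to cover the field at a given ground
    sampling distance (meters per pixel side)."""
    return round(field_m2 / gsd_m**2)

satellite = pixels(10.0)   # free public satellite imagery
aerial = pixels(0.08)      # the quoted 8 cm aerial resolution

print(f"satellite pixels: {satellite:,}")
print(f"aerial pixels:    {aerial:,}")
print(f"ratio:            {aerial / satellite:,.0f}x")

# Raw storage, assuming 4 channels (RGB + near-infrared) at 1 byte each:
bytes_per_px = 4
print(f"aerial raw size:  {aerial * bytes_per_px / 1e9:.2f} GB per capture")
```

One field at one flight is already a quarter-gigabyte of raw imagery under these assumptions; multiply by thousands of fields and 13 flights per season and the "petabytes of data" claim above stops sounding like hyperbole.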
Whereas at the resolution we're at, we're able to see the crops emerge weeks before you can see it in satellite imagery. You can see the different stressors within the field. You can see individual weeds and weed clusters. And that level of detail makes the data richer, allows us to do earlier and better prediction across all of the different tasks that we're interested in. And then, because we fly around 13 times a season, we have a continually evolving view of the field. From a single snapshot at any point in time, you can do detection decently well, but it's pretty hard to do prediction. With this temporal element, now all of a sudden you have that story, you have that evolving health of the field. And by using multiple flights, we can do both better detection and better prediction. And that's very exciting. So it's big data on a lot of different fronts. And because we have so much of it, we can turn around and use a lot of the deep learning methods out there that help us deliver models across a variety of tasks, a variety of different lighting conditions and domains, and really scale up quickly and address the issues that are most pressing to the farmers.

Al Eisaian: And just to give you an indication on scale, when we talk about resolution, the free satellite is like 10-meter by 10-meter squares. We're talking about 8- to 10-centimeter squares.

Harry Glorikian: Yeah, like you can actually see the bug on the leaf.

Al Eisaian: Not quite, not quite. We need to get down to maybe a couple of centimeters to see the bug on the leaf. But we can see a lot at 8 to 10 centimeters. And that's not far away, right? A couple of centimeters at scale. You can do it with drones today. The big question is, again, the volume of data. Because every time you come down in resolution, it just explodes.

Harry Glorikian: So every one of these companies out there is obviously trying to convince...
It was funny, because I was reading, I haven't gotten through it, but a Technology Review piece that was just written about ag tech. But everybody's trying to convince you, "You need our technology because it improves yield," or some other aspect. So how do you... What's the pitch? And how do you win a farmer's trust, right, to be part of this process?

Al Eisaian: You know, I think, again, back to how the company was built. Way before we decided to really focus on just the ag sector, I personally visited like a hundred farmers. And my team has probably visited hundreds of farmers. A lot of those visits were actually on their farms. And a lot more were done at these shows, Farm Progress and other shows. We would just engage people in conversation and ask them: What are the issues that they're having? How do they do their day's work? And we had a lot of ride-alongs. Literally, we lived with farmers, just to try to understand. It's not just technology for technology's sake. In our case, the question was, is it going to be used? And farmers basically were not interested in just getting a bunch of images. They were like, "Just tell me what my problems are. Tell me soon enough that I can go address it. And if you do that, then we'll engage." So initially, the first couple of years, we were just iterating directly with the farmers. The last couple of years, what we've done is, obviously, we think that we're getting closer and closer, and I think we are there now, where this technology can be distributed through our partners. So large companies that have tens of thousands of farmers that they can serve with our technology. So the go-to-market with farming is quite a challenge. And that's a thing that I completely, completely underestimated. I thought farming was simple. Farming is really, really complex.
And I was like, this is my fifth company I've started in two decades. And I can say, by probably an order of magnitude, this has been the hardest, because there are so many elements, especially in outdoor farming. I think with indoor farming, vertical farming, there are a lot of elements that you can control. So indoor farming is a lot simpler. And if I were to... Maybe I shouldn't say this, but if I were to start it all over again, I would go after indoor farming. I wouldn't do outdoor farming. But my love of sustainability and the planet and stuff like that would still pull me to the outdoor broad acre.

Harry Glorikian: So I'll be honest with you, the first time, and this was a long time ago, the first time I went to EPCOT Center, and I went through their hydroponic area and sustainable farming and then the aquaponics area, I was like, "I really want to start a business like that." And I swear to you, every time I see an article, I get sucked into it, because I think this is going to be the next big opportunity, although making money there is really hard.

Al Eisaian: Exactly. That's because you don't know all the details. That's the curse of entrepreneurship. It looks really good. We're like, oh, my God, you see dollar signs. And you see your name in the headlines. But then you get engaged and, oh, my God, it's like a can of worms after a can of worms after a can of worms.

Harry Glorikian: I know it, Al. It's funny, because every time I get involved in something I don't know every detail, but once you're in it, you've got to get out of it. And so you've got to dig your way out of the hole, no matter what. Right? Otherwise it fails, and that's not acceptable.

Harry Glorikian: So, Jennifer, when you guys are doing this stuff, how much of this are you having to... You know, I keep thinking about my world, where we have images, we have classification of those images or diagnosis from them, and then we train the system over and over and over again.
And the bigger the data sets, the better. You guys are working not just with one image, but with multispectral levels of imagery. So how are you approaching this from, I guess, the machine learning perspective? I don't even know all the techniques that you guys are using, but are you taking stuff that's off the shelf? Are you having to design it from scratch? Is there some combination? Walk us through how you look at that area and where you see that technology going next.

Jennifer Hobbs: Sure. Well, we do a little bit of both, and, interestingly, the medical imaging space is probably the space that is most similar to what we're doing as far as techniques used, because of the size of each individual image and the number of images. So we do steal a lot from the cutting-edge work that's being done in the medical imaging space. But one thing I've come to believe, whether doing R&D in an academic setting or an industrial setting, and I did my PhD in physics, so I have an academic background, is that you can do the research in an iterative, agile-like fashion. So a lot of times we will take essentially a baseline model, or whatever is standard in the field. What's the simplest thing that we think can work that's going to get us some initial results? And we'll try it and we'll see how it works, and then we can decide how to go from there. So if we're talking about a detection or segmentation task: if I take one image and I do the simplest thing possible, which is just maybe stack all of the channels together, how well do I do? And then when I start to look at the failure cases, I can start to see, well, the mistakes that it's making... would it do better if I could give it more historical information?
OK, well, if I want to fuse the temporal element, how might I construct the network so that I can bring in this additional time element? Sometimes you can do it as simply as just stacking more images. Sometimes you need something temporal in nature, an RNN- or LSTM-type approach. In this work that was just accepted at AAAI, we used a U-Net to get the features and then a convolutional LSTM to incorporate the temporal elements. Other times, maybe it's not so much the temporal element; it's that I need more context. I need to see more of the field with a single glance, so we can use some of the dilated convolution techniques out there. So a lot of it is starting simple, seeing what works, seeing where things are still lacking, and then identifying the different routes, the different ways we can fuse more information into the system, more and more and more, until we get to a level that we're happy with.

Harry Glorikian: So I'm trying to get again to the secret sauce. Is the image-gathering part of the process becoming commoditized over time, by the drone technology or different methodologies of capture? Or is there uniqueness in the capture part? Or is the uniqueness on the data analytics side?

Jennifer Hobbs: I think it's both. I mean, with the imagery itself, one of our strengths is the temporal element. But assuming you have the data, a lot of times with data science and machine learning, the secret sauce is actually asking the right question: knowing what it is you're looking for, or what problem you're actually trying to solve. Sometimes it's easy to get caught up and say, "I want to do everything all at once," or "I want to detect this." And maybe you actually don't care about detecting this. What you really want is to solve a downstream process.
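Hobbs's point about dilated convolutions giving "more of the field with a single glance" comes down to receptive-field arithmetic: with stride-1 layers, each 3×3 convolution with dilation d adds 2·d pixels of context, so exponentially increasing dilations grow the receptive field exponentially with depth while a plain stack grows it only linearly. A small sketch of that arithmetic (the layer configurations are illustrative, not IntelinAir's actual network):

```python
# Receptive field of a stack of stride-1 convolutions.
# Each stride-1 layer with kernel size k and dilation d
# adds (k - 1) * d pixels to the receptive field.

def receptive_field(layers):
    """layers: list of (kernel_size, dilation) tuples for stride-1 convs."""
    rf = 1
    for k, d in layers:
        rf += (k - 1) * d
    return rf

plain = [(3, 1)] * 4                        # four ordinary 3x3 convs
dilated = [(3, 1), (3, 2), (3, 4), (3, 8)]  # exponentially dilated 3x3 convs

print(receptive_field(plain))    # 9: linear growth with depth
print(receptive_field(dilated))  # 31: same depth, far more context
```

Same four layers, same parameter count, but the dilated stack lets each output pixel "see" a 31-pixel-wide patch of field instead of 9, which is exactly the extra context the single-glance framing describes.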
And so a lot of times it's still understanding what the farmer needs, what they want, what their end acceptance criteria might be. And then going after that. Because in truth, in, let's say, an academic lab, you never say, "Well, I actually don't care if my model is not 100 percent." You want the best outcome possible. And certainly we do, but we also have to look at it in terms of performance versus cost, performance versus time. If I can make a model that runs three times faster and is only two percent lower in performance, well, now the cost is a lot less. And so there's that business criteria on top of the actual machine learning. And I think a lot of that, understanding how this is going to be used, how this is going to deliver value to the customer, is also one of the things that we do really well.

Al Eisaian: As far as the capture side, yeah, we've been the main culprit in commoditizing it over the last five years. I was on a panel, I think, five years ago, and I said, "You know what, I should be paying a penny per acre per capture." And this was when people were charging $4 an acre for one capture, and that was at 28-centimeter resolution, and I wanted it at 5-centimeter resolution. We didn't quite get to 5-centimeter resolution. And we didn't quite get to a penny per acre per capture. But we're pretty damn close. And I think we're going to get closer over the next three, four, five years. So the more we can automate the capture... Right now, the vast majority of our capture is through manned airplanes with very, very high-powered, expensive sensors in the belly of the airplane. These are not your Canon cameras hanging out the window. These are like quarter-million-dollar sensors that you can fly at 120 knots, cover 150,000 acres a day, and capture at 8- to 10-centimeter resolution, pretty accurately. I think probably five years ago it was military technology.
Now it's commercial. And we're hoping that more military technology will become commercial. So I think that's commoditizing. And then, I think it was two years ago in October, the US government relaxed the limit on satellite imagery for commercial applications from 50 centimeters per pixel to 25 centimeters per pixel. So you can see that, from a standpoint of purely ground spatial resolution, that is happening. Right? I mean, our government probably has technology at 5- or 10-centimeter resolution today, but it's not open for commercial use. That's going to change over the next three to five years. I'm willing to bet good money on that. Now, you still have the thermal problem, especially for the agriculture sector. But imagine that you have satellite imagery at 5-centimeter ground resolution. That becomes pretty powerful. Right? And as far as commoditization, that data should, I hope, continue to come down in pricing so that it's available and it's ubiquitous.

Al Eisaian: So then, back to your question of what is the real differentiator and secret sauce? It's the analysis. It's the AI. That's one area that is going to continue to be a bottleneck, and continue to be more of a bottleneck in agriculture, because the vast majority of data scientists and machine learning PhDs are not smart enough yet, as Jennifer is, to actually go into agriculture. Everybody is doing the same things. We have an overabundance of people doing self-driving cars, an overabundance of people that want to go into the health care field. But we have the really smart people that come to agriculture, like Jennifer.

Harry Glorikian: Well, I can tell you, we definitely don't have enough people that go into health care. I can attest to that. I mean, I keep trying to lure people and say, forget this whole Facebook junk. What are you going to do there?
Come to health care so that you can change people's lives.

Jennifer Hobbs: The one thing I'll say is, there are a lot of things that we have in common with health care, but one of the differences is just the scope of the data. So the data itself is large, but we collect all of this raw data, and what really gives it value is when we can extract information out of it through these different models. And certainly to get started, at least, you need good ground truthing and annotations. And that's another thing where we have people skilled in that area who can generate these annotations for us. But I think one of the exciting areas in this field, and really an area that's hamstringing the CV-and-ag community out there, is: if we have petabytes of unlabeled data and only gigabytes of annotations, how do we narrow that gap? How do we use all of the unannotated data out there? Because in truth, we're never going to get all of it. You can't annotate the entire world every single day. So we need to use what we have to also maximize the unlabeled data that's out there. And I think that's a really exciting area that we're excited to go after, and I think it will be a real game changer on this front as well.

Harry Glorikian: I'm obviously thinking on my feet here, but I'm trying to figure out, OK, in our world, my predictive power, I mean, it's getting better and better over time, but I don't have as many elements, per se, affecting it, like the weather, the water, the tractor that came through. There are a lot of things that you guys are trying to adapt for. So it's sort of exciting. If you guys actually figure out how to take all these inputs and really predict better, I almost want to say I want that prediction model, and to start thinking about superimposing it onto my world, because I don't think we have as many variables.
I know somebody who listens to this is going to make a comment, "Harry, you don't know what you're talking about." But I do believe that you guys are dealing with many more unknowns than maybe we are in the health care world. So how good is the predictive nature of what you're doing at letting someone know something before it happens? To say, "You may want to go and look over here," or, "By the way, historically, we've noticed that if you do this, you get a better outcome." Are you guys at that level of being able to make those recommendations to farmers?

Jennifer Hobbs: That was the really exciting result that came out of this work that was accepted at AAAI, done by Safa, who was a PhD intern with us last summer. We were doing nutrient deficiency detection from the air. Can we find areas that are under stress? And this is really important, because once stress sets in, you can't fix it. You can just sort of stop it. So you want to know as soon as possible that this area is lacking nutrients, so you can go out and spray. At the same time, it has an environmental element to it, because the more targeted and precise you can be in applying the chemicals, the less excess chemical ends up in the water table, for example. So first, we want to detect it. But let's say detection for this task with our data, you can try a bunch of different things, and it hovers around an IoU score of, let's say, 0.4, depending on where and what time of the season. And we did a lot of things from a single image, and it was hard to get it above that. When we started including the temporal element -- what if we include the previous two flights? -- all of a sudden that IoU for detection shot up to, I believe, close to 0.6. And so then our next immediate question was, well, if I can now detect really well, can I anticipate this one, two flights out?
And we saw that again, using this flight-over-flight information, we were able to predict these regions of stress two flights into the future better than we were able to detect them from a single image initially. So seeing how the field is changing week over week gives the model enough information to say not only is it here, but this is where it's going. And that's extremely powerful and has a lot of value to the farmers.

Harry Glorikian: So it's similar to, now I forget her name, but she's over here at MIT, where she's taken historical MR images and been able to find features that predict a tumor advancing into the future before a human being can actually really see those features. And so I guess that's my next question: what does the system see that a human can't see? I'm sure it's a lot, but work with me here.

Jennifer Hobbs: The answer right now, today, is we don't know. Right? That's the trust issue with these deep learning models, unlike past machine learning models, which were based on handcrafted features, where you could say, oh, it made this decision because of these features. There are a lot of things we can do to try to understand what the model is looking at. But it's not as straightforward as in the past. So interpretability is obviously a huge area of the machine learning community right now, and one I think will continue to grow, because people want to know, what is it looking at, what is it seeing? And there are some additional things we can do in our field, kind of like medical as well, where you say, well, in addition to knowing what the model is looking at, I actually want to know causal effects. And that's a whole 'nother area as well that's, I think, really catching steam. So, yeah, the answer is we don't know. We can hypothesize and say, well, you know, by the way it's constructing its features it's a little bit more robust to lighting changes.
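The IoU (intersection-over-union) scores quoted a moment ago, roughly 0.4 from a single image climbing toward 0.6 with temporal context, measure the overlap between a predicted stress region and the ground-truth annotation. A minimal NumPy version of the metric for binary masks (the toy field and masks are our own illustration, not IntelinAir data):

```python
import numpy as np

def iou(pred: np.ndarray, truth: np.ndarray) -> float:
    """Intersection-over-union for two binary masks of the same shape."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    union = np.logical_or(pred, truth).sum()
    if union == 0:
        return 1.0  # both masks empty: perfect agreement by convention
    return float(np.logical_and(pred, truth).sum() / union)

# Toy 10x10 "field": a ground-truth stress patch vs. a prediction
# that is shifted by one pixel in each direction.
truth = np.zeros((10, 10), dtype=bool)
truth[2:6, 2:6] = True          # 16-pixel stressed region
pred = np.zeros((10, 10), dtype=bool)
pred[3:7, 3:7] = True           # same size, offset by one pixel

print(round(iou(pred, truth), 3))   # 9 / 23, about 0.391
```

Note how unforgiving the metric is: a correctly sized patch off by a single pixel already scores near the 0.4 single-image baseline, which is why the jump to 0.6 from adding two prior flights is a substantial improvement.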
So it's able to control for this and that and actually see this sort of evolution. But we don't know that. That's our best hunch at this point. But that's really all it is, a hunch.

Harry Glorikian: I can see how over time this is going to provide more accurate, actionable information about crops. But let's say you sign somebody up and they start their first passes. When do they start seeing the benefit of the service?

Al Eisaian: It's almost immediate, right? Because, A), they don't have to go through a bunch of different point solutions to try to keep an eye on things. I mean, we're talking about vast areas, right? These are multi-thousand-acre farms. And, you know, in the US, it's not really contiguous farms. You might have a couple of plots over here, a couple of fields over here, and then several fields 10 miles away, because of how inheritance has worked out and because of subsidies and whatever. And so the fact that, in the winter, or if you have inclement weather outside, you can actually sit in front of your computer or on your iPhone and keep an eye on your domain, if you will, just flipping through the stuff, that's immediate value. And you don't necessarily need every flight to have happened.

Al Eisaian: It's a continuous system. And then you've got 13 high-resolution captures, because there's stuff in the system already. So there's a bunch of stuff you can look at from your last season that allows you to make decisions for the season that you're in. And so the value is almost immediate.

Al Eisaian: And then I also want to emphasize a couple more things. One, it's a decision support system for the farmer as far as which fields do I go to. So we do the prioritization. We say here are the severe areas, by field, by percentage, so that you know exactly. And then we also pinpoint where the problem is.
So they don't just go to the field, they actually go to the, they're staring at the problem.

Harry Glorikian: That's interesting. It's exactly like what I was thinking about, guys, because, you know, they've developed a system that can show a cranial bleed and it'll move it up on what a radiologist should look at. So there are so many similarities in these technologies. It's just looking at different spaces.

Al Eisaian: We flipped 80-20, or maybe 90-10, which is, instead of 80 percent of the time guessing or trying to figure out where your problems are and 20 percent of the time addressing your problems, we flip it, which is we take care of that. I mean, we actually alert you, so I would say it's 5-95 right now. We tell you where the exact problems are. So 95 percent of the time you are addressing issues. And then the second thing, with regards to the collaboration that happens between the farmer and all of the people that are around the farmer, the retailer, the sprayer company, the irrigation company, the seed company: if they give access to their fields, then they can actually do it remotely. So we're talking now tele-agronomy.

Harry Glorikian: That was going to be one of my next things: how does this dovetail with all this, what is it, precision ag technology that's out there? Are you working with those companies to integrate this information?

Al Eisaian: Yeah, the way that we have built the product and the insights we can populate, we have API integrations with John Deere and Climate FieldView and a whole host of others. We believe that insights and data should be democratized and free. Not free necessarily in that we don't want to make money, but from the standpoint of where you need to consume it. So it could be mobile and you can consume it on our app, on AGMRI. It could be a widget inside of a John Deere operations center. It could be a widget inside of Climate FieldView.
The main issue is what is the preference of the farmer? Wherever they are consuming their stuff and want to get these insights, we're happy to pipe it over there. So these collaborations, as I sort of think about the future, it's better data. I mean, I think Jennifer hit it right on the nail, which is you've got to increase the trust, and that trust translates to lower costs, higher yield, less headache, better lifestyle. Because for farmers, from the planting phase all the way to harvest, to planning for next year, it's a pretty anxious time, right? So imagine that this is actually also a lifestyle improvement, because now you feel a lot more in control, versus guessing, versus somebody else coming and telling you stuff, versus there's always some sort of disease that's a runaway, versus it's surprising you. Wouldn't you want to know if it's in the next county? If you can take some preventive measures, you can be in a better situation. So the old saying is an ounce of prevention is worth a pound of cure. Unfortunately, people don't pay for prevention. They pay for a cure. And I think that's where that whole mindset is shifting.

Harry Glorikian: It's interesting because we are trying to shift health care away from only treating somebody when they're sick, and actually managing them to keep them healthy is more valuable. So. I mean, I have two sorts of questions. How do you look at yourselves versus other people in the field that are making a lot of claims, because I have seen things around carbon sequestration and so forth. And then a dovetailing question is, I feel like there's so much more that you could do with this. I know the application that you're looking at, but there are possibilities around commodities and all those sorts of things. I'm a capitalist, I can't help myself. I'm thinking about, you know, but there are so many other areas.
What could those other areas be that this is applicable to? And again, how do you compare to other people in the field? Not trying to pull anybody down or raise anybody up, but just as a sort of thought process.

Al Eisaian: We're the best and everybody else is just so-so.

Harry Glorikian: [Laughs] I should have asked Jennifer that question, Al.

Al Eisaian: Not from the boastful entrepreneur. Very fair question. So I think, I mean, it's really a question of approach. From day one, we've invested in data science and cutting-edge science. And literally we're starting to come to market this year, five years after starting the company. This is the year that we're going to actually spend money on marketing and sales. Why? Because it's damn hard. I mean, Jennifer just explained it. It's really, really hard to get to a level where you can with a straight face tell people that this is not vaporware, that this actually works.

Al Eisaian: In comparison to others. You know, look, carbon sequestration, at the core of it, what does it entail? You have to measure, so you have to trust the measurements that you're making of certain practices. You have to verify. And you have to certify. And then you have to pay people. The certification process, the verification process is the hardest, and who has the most granular information in the world? Nobody has invested as much money as we have in really, really granular, really, really high-cadence data, like 13 times a season. But then there's a bunch of other things that are like every five minutes. Weather. Precipitation. And so when you look at it that way, you say, OK, if you're thinking about carbon sequestration, if you're thinking about actually helping the climate situation: agriculture and forestry. Agriculture is 25 percent of the problem and also 25 percent of the solution. And forestry is 17, 18 percent, depending on whose numbers you're talking about.
If you take those two together, then everybody should be talking to IntelinAir about our technology. Everybody is interested. And then, as I said, we're just starting to talk about and boast about our stuff. But think about FedEx spending $200 million buying carbon offsets in the future. Who's going to measure it? Who is going to verify it? Who is going to certify it? Who is going to make sure that that farmer gets paid? These are challenging things that have to be solved. But at the core of it, we've got a solution. Now, somebody else can take that solution, or maybe we will do it, and then monetize it, but ultimately it's not through handwaving and PowerPoint presentations, it's really about science. You have to measure it, right? You have to say, "I actually sequestered X many gigatons of carbon. And here's the measurement before. Here's the measurement after." Right. And here's what the farmer did. And he deserves this check, OK? And so I think on that front, we like our chances.

Al Eisaian: With regards to some other people. I mean, look, some people look at this thing primarily as an imagery business. We've never looked at it as an imagery business. We've always looked at it as a crop intelligence business. What you're trying to do is use science and the highest-fidelity data that you can get your hands on to provide real solutions, to provide real, take-it-to-the-bank ROI to the farmer, but not only to the farmer, to everybody else that's involved. You mentioned commodity trading. Would it behoove the people that provide working capital to farmers? It's sort of like the Progressive Insurance thing: if you say yes to this gadget inside of your car where I can measure how you're driving, I'm willing to give you a 20 percent discount. We're going towards that. So on the most advanced front, we are talking to Wells Fargo and other companies.
They're starting to think about that, because that's a big asset. I mean, if you're giving working capital to people that are not data-driven, that might cost them more.

Al Eisaian: Insurance. You know, one of the things that I learned in year two: there was a massive weather problem in Iowa, and I went to this farmer's shop and there were like five drones, different types of drones. And I said, what are these drones for? He goes, oh, yeah, when weather hits, my brother takes that one, I take that one, my cousin takes that one, and two field hands take these two. And we all jump into our trucks and we drive out to the fields. And for the whole day we survey, we fly the drones, take imagery, bring it back, take it out, put it into the system. Think about the level of detail that they have to go through just to negotiate with the insurance adjuster what they need to get paid on the crop insurance front. That's one way of doing it. Now imagine the way that we can do it, which is both the insurance provider and the farmer are subscribers to our system, and we actually have algorithms that tell you exactly, by percentage, what the damage was. So there is no pissing contest between, oh, look at my thing, look at my video, look at my this.

Harry Glorikian: So what I find interesting is, I was talking to somebody at another venture fund earlier today, and I was saying to them, I'm like, you know, once you digitize something, the potential business model shifts are phenomenal. You just have to imagine them. And now you've got to bring other people along with you, which is half the problem.

Al Eisaian: I want to do it for the farmers, right. I mean, some farmers say, what are you gonna do with my data? I go, you know what? I want to pay you for your data. And they're like, what? I go, yeah, you know, if you and I get into business where your data now matters because you're running your farm better, you should get a better rate.
You should get a better insurance rate. You should get better yield. You should get better everything, right? That data has value and I want to pay you.

Jennifer Hobbs: You can turn it around, you can use it to create better seeds, better products. There's obviously a ton of research done in the labs, around the farms, that's being used to develop these other products. But then they have to go out and live in the real world. And the question is, well, how well is this product going to work on my field, given all of these things? You know, what if they didn't have my type of soil or my type of weather? What if it rains more or less this season? And now you have acres and acres and acres. You have entire states of data that you can actually look at to see how well these different combinations performed. More than just, you know, "Here is a really confined experiment that was run," how did it actually fare out in the real world? Because maybe it's very effective, but it has to be used a certain way, and you find that people aren't using it that way. Well, if I make these changes, can I get better yield? And I think that's where having the data coming in just opens up so many different possibilities.

Al Eisaian: There's one more thing to add relevant to this. Imagine that USDA has thousands of people that call and get survey data. They call a farmer that has, let's say... This is a case in point, like a real, real live thing. The farmer has 43 fields. He reports on one field and extrapolates. And that's how USDA, for the most part, gets their estimations. They use some satellite stuff as well, but can you imagine? It's $8 billion a year of guarantees. And I don't know how much, but I'm sure there's hundreds of millions of dollars of fraud that happens where the farmer reports something that didn't really happen. And then they have to get the federal farm insurance.
So what I'm saying is that, you know, the US government should scan and get all the data, and just give it to people like us to do the data crunching. Right. It would save tens of billions of dollars of taxpayer money, literally. Because right now we're paying for the capture. We're doing all the analysis. We're doing the productization. Can you imagine? That's, I think, where we need to get to.

Harry Glorikian: So let's jump back to the technology for a second. Where do you see this going? Because every time I try to keep up with this, I'm barely able to. It's moving almost too fast, in a certain sense. So where do you see this going from a technological perspective? Is it resolution? Is it analytics? Is it predictive power? Or is it all of the above? If you were giving a visionary talk about where this is going in the future, how would you frame it?

Al Eisaian: I'll start, and then Jennifer can probably be much more articulate about this. Look, we've made our bets. Eighty cents on the dollar for us in R&D and engineering goes to AI. We're making huge, huge bets on that. We keep hiring more people. And maybe as an entrepreneur I should stop that, but maybe not. But that's the bet we're making. On the capture side, I think there are two very promising developments that we're betting on. One is that ultra-high-resolution imagery below the atmosphere will continue moving to these high-flying drones that don't need bathroom breaks, that can fly 24 or maybe 48 hours at a stretch, and they can capture 10, maybe 12 times more of the data that we need. And so obviously the cost will come down. The second is sensor tech: there are many, many great companies, both defense-related and non-defense-related, that are working on sensor technologies that will blow your mind. And we can go to hyperspectral imaging, which for disease detection and stuff like that becomes really valuable.
So that's on the sort of physics side of things: flying sensors, hyperspectral. But I think the most exciting part is post data capture. That's everything that Jennifer and Jennifer's team does. And I'll pass it to Jennifer.

Jennifer Hobbs: Whenever I give academic talks, I try to capture the minds of the people in the computer vision and machine learning fields who might be doing stuff like self-driving cars or what have you, because there are so many opportunities to make computer vision for agriculture better in the future, and I think to benefit both the agriculture and the computer vision side. There are challenges because we're getting so much data, more data, more sensors, just more types of data. Right now, you're going to run into this point where, what if the information on a single field is a terabyte? What do I do with that? How do I process it? How do I extract all of the information? What kind of methods do I use? If I have hyperspectral imagery coming in all the time, and then I have all this equipment data and all this weather data, how do I make sense of all of that? And there are so many different avenues there to explore. I hope people in the machine learning community get really excited about this and say, it has huge implications for the agricultural industry, but it's also a great domain for us to improve our understanding of computer vision. So I think as more and more data comes in, it just puts the burden on us to come up with methods that can handle this amount of data. How can I handle an image that's maybe 100,000 by 100,000 pixels, fifty times during the season, where I have hyperspectral data, with all of this weather coming in? And I think that's a really exciting piece. And then I think that also prompts things on the hardware side: you see a lot of interest around the different chips, the different edge devices that are used to process this data.
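A standard way to cope with images far too large to fit in memory, like the 100,000-by-100,000-pixel mosaics Jennifer mentions, is to process them in overlapping tiles and stitch the per-tile outputs. The sketch below is a generic illustration of that idea, with invented tile sizes and a small toy array; it is not IntelinAir's actual pipeline.

```python
import numpy as np

def iter_tiles(image, tile=1024, overlap=64):
    """Yield (row, col, tile) crops from an image processed piecewise.

    image: (H, W, C) array, e.g. a multi-band aerial mosaic of a field.
    Tiles overlap by `overlap` pixels so that per-tile model predictions
    can be blended at the seams instead of showing hard boundaries.
    """
    h, w = image.shape[:2]
    step = tile - overlap
    for y in range(0, h, step):
        for x in range(0, w, step):
            yield y, x, image[y:y + tile, x:x + tile]

# Toy "field": 2500 x 2500 pixels with 4 spectral bands.
field = np.zeros((2500, 2500, 4), dtype=np.float32)
tiles = list(iter_tiles(field, tile=1024, overlap=64))  # 3 x 3 = 9 tiles
```

In a real pipeline each tile would be fed through a model and the outputs merged back into a field-sized map; the key property is that peak memory is bounded by the tile size, not the mosaic size.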
I think it just encourages more and more of that in the future. And so I am optimistic that, with a lot of these challenges, ag will start to be a preeminent domain in computer vision, an area, just like autonomous vehicles, that people are really interested in, because it improves our understanding of these methodologies in addition to changing the world.

Al Eisaian: And you can't eat an electric car. You can eat an ear of corn.

Harry Glorikian: No. Yeah, but I was always thinking that there are techniques and approaches you're learning that we can learn from. I just don't know if anybody's cross-referencing the work or the papers that are being written. I'm sort of the geekoid who's trying to read, you know, obviously the title captures my attention, but, you know, reading all sorts of stuff, because I know that it's a tool. It doesn't matter what you're throwing it at; the tool, with a few tweaks, might work well. So I'm trying to absorb all this stuff, and hence the conversation. Besides the fact that I think editing of crops, or making changes in crops, and then applying all the stuff that you guys are talking about, I mean, it is a combination. We're going to change the way the world is fed, over time.

Al Eisaian: Absolutely.

Harry Glorikian: Well, this was great. I look forward to staying in touch and hearing how the company evolves and, again, how the technology evolves, though I will probably always be struggling to keep up with everything that you're saying. But that's OK. That's part of my job, trying to understand what's happening and where it's going. So thanks very much for the time, and I look forward to hearing how this thing evolves in the future.

Al Eisaian: Thank you so much for the opportunity, Harry.

Jennifer Hobbs: Thank you so much.

Harry Glorikian: That's it for this week's show.
We’ve made more than 50 episodes of MoneyBall Medicine, and you can find all of them at glorikian.com/podcast. You can follow me on Twitter at @hglorikian. If you like the show, please do us a favor and leave a rating and review at Apple Podcasts.  Thanks, and we’ll be back soon with our next interview.