Washington AI Network Podcast
Host: Tammy Haddad
Guest: Vice Admiral Frank Whitworth, Director, National Geospatial-Intelligence Agency
December 6, 2023
Tammy Haddad: Welcome all of you to the House of 1229. How many folks are new here? Okay, great. Now you know the ladies have it. Welcome. Enjoy the night. I'm so glad you could all be here. Sachiko Kuno, who I think some of you know, great scientist, wanted to create this place to have... I think we need to kill the music. Okay, great. And she liked music, too. She actually has to have her make. But the idea was to bring women leaders together. And the idea is to have a place where women leaders can come together. Kathleen Buhle, who some of you know, has put this place together and really created a wonderful place. We are so glad you're here. And the idea was that we would bring, you know, government leaders, civil society, all the amazing people, Bob Woodward, to have all come here to talk about big ideas and what's going on in Washington and how we can make a difference. And I'm so thrilled that Admiral Whitworth is here. The head of the National Geospatial-Intelligence Agency. Let's give him a round of applause. Thank you so much for being here. You are the first admiral here as far as we know. But Teddy Roosevelt lived here when he was under secretary of the Navy. How about that?
Vice Admiral Frank Whitworth: Same wallpaper?
Tammy Haddad: That's what Kathleen wanted, to reenact that period of time. And I'm so glad you're here at Washington AI Network. We are here to bring people together to talk about AI. It's changing all of our lives in every which way we can even dream of, and you are on the cutting edge of all of that, and a very, very important intelligence agency. And first of all, you have to talk about where you guys are now with AI and really how you interact with the other agencies.
Vice Admiral Frank Whitworth: Thank you, first off, for the invitation, for everyone's being here. And it is a, this is glorious, an absolutely beautiful environment. So, for something this important, let me just back up a little bit about NGA, just in case people don't know NGA. So we are in charge of the visual domain, as far as intelligence agencies are concerned, for both the Defense Intelligence Enterprise and the entire IC. I'm the, what's called the Functional Manager for GEOINT, Geospatial Intelligence. We have about 14,500 people. The majority are in the Northern Virginia area. About 9,000 or so are there. And then we have 3,500 in a concentration area in St. Louis. 500 in Denver. And then the swing arm, the difference maker as far as I'm concerned, of about 1,500 people that are at all the points of light that we have in DoD, especially those combatant commands. The services, subunifieds, they're all over the place. And that's what really makes us a combat support agency. We're more than just an intelligence agency. We're a combat support agency, and we're co-located, breathing the same air, in the same meetings with those COCOMs, for short, Combatant Commanders, learning what they need to ensure that we collect on the right things and that we exploit the right things on time.
Tammy Haddad: So you have two bosses though, right?
Vice Admiral Frank Whitworth: I do. I work for the Secretary of Defense as well as for the Director of National Intelligence. On the DoD side, I report directly to the Undersecretary of Defense for Intelligence and Security, that's Ron Moultrie, but almost all of my emails go to two people at least. Sometimes I throw in the Chairman there just to ensure that, that he's aware too.
Tammy Haddad: Can you talk about how you're using AI?
Vice Admiral Frank Whitworth: Absolutely. So there's an interesting story here. We're not new to AI at NGA. I'm not going to speak to how much the intelligence and the terabytes from space have increased over the last 15 years. I won't give an exact number on that, but it's been substantial. And yet the number of personnel assigned to the task of breaking that out, telling the tale of what's happening on the ground, has not gone up substantially in the last 15 years. So we've had something called computer vision, which is effectively using an algorithm to go through the imagery on your behalf to then triage, to cue a human to an opportunity: "I need you to take a look at this," says the machine, okay? The issue here is that you have to train the machine. This is the ML [machine learning] portion of what we're all here to talk about. And that's where we're very unique. The buck stops with us relative to the proper training of the machine. So you have to take the best in the business to ensure the machine learns correctly.
Tammy Haddad: And how do you do that?
Vice Admiral Frank Whitworth: Well, this is where wetware [human brain] comes in, because when you have people, and I sign letters of appreciation for people who have been at NGA in excess of 40 years all the time. We have people that love being in this business. They know the contextual framework for change, what is baseline activity based on what they see visually. And then, of course, what is anomalous. And it's the anomalous activity that we need to train the machine to [00:05:00] be on the lookout for. So the idea here is to be able to scale with further growth. As I mentioned, the number of terabytes coming from space has gone up precipitously. Exponential is going to be too strong a word, but over the next 9 or 10 years, it's getting very steep in the number of terabytes that will come from space. So how do you scale accordingly? This is where a more concerted AI/ML effort is needed. So, I'll introduce Maven here in a few seconds, but just to kind of give you an idea of what we're talking about. Most Americans, I think, will probably relate to computer vision and to the art of detection by a machine when they experience Global Entry. Let's say you're on an international flight, and you're amazed that the machine recognizes you and confirms that you are who you think you are. So some labeling has occurred. Some labeling has occurred during the course of training that machine. If it were only that easy for us, because now let's think about what that particular machine has done. It's taken an image with very little else other than your mug, okay? That's about 80% of the field of view of that particular image. 80%. That's not what we're dealing with when we talk about taking pictures from space, and I, I don't want anybody to apply this mathematically and try to now derive the resolution of a typical satellite image, because it will not be correct. I'm giving some generalities here.
Tammy Haddad: Okay.
Vice Admiral Frank Whitworth: We're looking for things, activities, pieces of equipment that are two ten-thousandths [corrected: one five-millionths] of a percent of that field of view. So, to say that we're looking for a needle in a haystack is not saying it strongly enough. We're looking for a needle in a bunch of needles. Okay?
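[The tile-and-cue triage Whitworth describes can be pictured with a short sketch. This is a minimal illustration only, assuming a hypothetical scoring function and made-up tile size and threshold; it does not reflect NGA's or Maven's actual systems.]

```python
# Illustrative only: walk a large overhead image in tiles, score each tile with
# a stand-in "model," and cue a human analyst when the score clears a threshold.
import numpy as np

TILE = 256        # tile edge in pixels (assumed)
THRESHOLD = 0.2   # cue-the-analyst score (assumed, tuned to this toy example)

def anomaly_score(tile: np.ndarray) -> float:
    """Placeholder for a trained computer-vision model.
    Normalized brightness variance stands in for a real anomaly score."""
    return float(tile.std() / 255.0)

def triage(image: np.ndarray) -> list[tuple[int, int, float]]:
    """Return (row, col, score) for tiles worth a human look."""
    cues = []
    rows, cols = image.shape[:2]
    for r in range(0, rows - TILE + 1, TILE):
        for c in range(0, cols - TILE + 1, TILE):
            score = anomaly_score(image[r:r + TILE, c:c + TILE])
            if score >= THRESHOLD:
                cues.append((r, c, score))
    return cues

if __name__ == "__main__":
    # Synthetic 4096x4096 single-band "image": mostly flat terrain, one busy patch.
    img = np.full((4096, 4096), 40, dtype=np.uint8)
    rng = np.random.default_rng(0)
    img[1024:1280, 2048:2304] = rng.integers(0, 255, (256, 256), dtype=np.uint8)
    for r, c, s in triage(img):
        print(f"cue analyst: tile at ({r}, {c}), score {s:.2f}")
```

The point of the sketch is the division of labor he describes: the machine runs over everything and never gets tired, and the human only looks where it cues.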
Vice Admiral Frank Whitworth: So, this is the scale problem that we have, especially as you start having more and more images, in multiple types of imagery, especially like synthetic aperture radar, which happens with even more periodicity. Electro-optical, of course, the thing that we typically associate as a normal picture. There are going to be other mechanisms that we use for even higher-end forms of imagery. So, when you think about that, that's why we have to have AI. I probably will not have more than 14,500 people, all these wonderful people doing what they do. We need to actually concentrate on training that entry-level machine. And it will always be an entry-level machine. But it's not going to get tired. And it's going to run maybe while we're asleep. And it's going to run all the time with things that we may not have prioritized, but we're lucky if it says, this is an anomaly, I believe, based on my training, I need you now to pay attention to this. So this is the imperative of AI in GEOINT and specifically at NGA.
Tammy Haddad: Where do you get all that compute?
Vice Admiral Frank Whitworth: So now we're talking about a specific program. And this was a wise decision taken just in the last year and a half by the Secretary of Defense and, by extension, the Undersecretary for Intelligence and Security, for what used to be called a project, Project Maven.
Tammy Haddad: And how do you work with the other agencies? I mean, do you send them an email every day? Is there a briefer, like the famous CIA briefer at the White House? Here's our pictures today.
Vice Admiral Frank Whitworth: So right now, at the stage of Maven, it's largely dedicated to the future of targeting and this issue of distinction. So everyone will have something to gain from ensuring they have the best-of-breed AI for targeting. And, and on this issue of distinction, let me just clarify what I'm talking about. In the laws of armed conflict, we typically [00:09:00] associate, and most people would agree, these elements: humanity, necessity, proportionality, and then this thing called distinction. By virtue of being Americans and our values, we're typically reared to understand the first three. Distinction is hard. Distinction is different. We are not necessarily reared to be able to say, "Based on what I see, I, I deem that to be combatant, or I deem that to be non-combatant." And that is the essence of what we call positive identification, which is, in my book, based on my 34 years of experience in targeting, the essence of targeting. To ensure you have the right target, at a point on the earth, set point with good accuracy, and obviously with a unique behavior. So, all of these agencies want the best for America, to include ensuring that we don't cause a presidential apology with a misidentification. There will be layers of other INTs [intelligence disciplines] that help build confidence, but at the end of the day, absent those layers, we [00:10:00] have to be sure that the visual layer is correct, because there are times that the visual layer is positive enough for fires, for a targeting solution to drop a bomb or something like that.
Tammy Haddad: So are there classifications that have been memorialized and used in all NGA events or combatant events?
Vice Admiral Frank Whitworth: The answer is, and you're really referring to confidence levels, and in this tradecraft of intelligence, we, we go with words like confirmed, probable, possible, very likely.
And it drives people nuts, frankly, when they hear all of these terms, because typically it's "Could you just speak English to me?" Well, they all have real meaning, they actually have real meaning and they're defined, and so the bar of distinction for positive identification is something that is typically written into the rules of engagement for that particular campaign or that particular episode. And that is a policy decision, and that will be a little bit of, kind of a moving target. But I will tell you this, the United States holds a very high bar on positive identification. And so, that's something that I've made my chosen profession, to be quite honest. This, that's the essence of this stuff.
Tammy Haddad: And then you feel confident because it's already decided; it's not like people walk around and make their own decision for each one. The reason I ask is because everything that's happening in AI now feels like it's a whole new set of rules. People are looking for rules. We have some people from Senator Schumer's office here. They've been gathering people to talk about what the new rules should be, laws, guardrails, all of that. But you're saying it exists, or at least in terms of what you guys are dealing with.
Vice Admiral Frank Whitworth: So when you're talking about positive identification, the essence of distinction, we are pretty hard on ourselves. It's true. So, we do have our own guardrails established in the wetware application of our tradecraft. And it's things like training ourselves against things like confirmation bias, or taking liberties with a normal checklist approach to whether you are there in positive identification. But to riff on your question, and this is where AI, I totally agree with the need for guardrails, in case we're tempted to cut corners with AI. And right now, as far as humans, at least the humans at NGA, we're building a certification program to ensure that will never occur, that it will be ethical. But I love also flipping this argument to suggest that AI can actually provide guardrails.
Tammy Haddad: In what way?
Vice Admiral Frank Whitworth: This gets into, if I've got individuals who've been working for 18 hours. I tried to make that, like, never happen, believe me, okay, but every now and then, if you have a...
Tammy Haddad: I was going to say, it sounded like it was fun to work there until you said the 18 hours. Sorry, I could go, I could go 12 or 14. 18 is like way too much.
Vice Admiral Frank Whitworth: I would hope that, I would hope that there are prospective [00:13:00] recruits listening, and I hope that they don't believe that that will happen. No, if, let's say we had a pop-up urgent kind of episode, and we had people who have been working a long time, and they're getting tired. That machine is not. So if that machine, during the course of ML, the ML process, if we've treated that as a true digital apprentice, we're going to count on that machine to double-check if we're getting tired, or if we might actually have a little bit of bias built in that we didn't realize we did. I love the idea of AI actually being there to provide just another set of eyes as its own guardrail. This idea of singularity and letting the machine take over, those are human decisions. And so the way that we're going to ensure that we approach this is very ethics-bound, very certifications-bound, totally in keeping with where DoD and the President are now, by extension, with his Executive Order 14110. We're, we're all in keeping with that.
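[The defined confidence terms and the positive-identification bar written into rules of engagement suggest a simple data model. The sketch below is hypothetical: the ordering of the terms and the example threshold are illustrative, not doctrinal definitions.]

```python
# Hypothetical sketch of defined confidence terms and a per-campaign
# positive-identification (PID) bar. Ordering and values are assumptions.
from dataclasses import dataclass
from enum import IntEnum

class Confidence(IntEnum):
    POSSIBLE = 1
    PROBABLE = 2
    VERY_LIKELY = 3
    CONFIRMED = 4

@dataclass(frozen=True)
class RulesOfEngagement:
    campaign: str
    pid_bar: Confidence  # minimum confidence level written into the ROE

    def meets_pid(self, assessment: Confidence) -> bool:
        return assessment >= self.pid_bar

roe = RulesOfEngagement(campaign="example-campaign", pid_bar=Confidence.VERY_LIKELY)
print(roe.meets_pid(Confidence.PROBABLE))   # False: below the bar
print(roe.meets_pid(Confidence.CONFIRMED))  # True: clears the bar
```

The design point mirrors what he says: the terms are defined and ordered, and the bar itself is a policy decision that can move from campaign to campaign.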
Tammy Haddad: Can you talk a little bit about that new executive order and how it affects you?
Vice Admiral Frank Whitworth: It [00:14:00] actually speaks to exactly where we thought we would need to go, which is to ensure that we are working with a sense of urgency and purpose, but at the same time keeping balance on these issues, especially involving the responsible application of AI and ethics.
Tammy Haddad: Okay, let's turn to China. So there's always talk about, "Oh my God, they're going to get far ahead of us on AI," or "They are already far ahead of us." Can you give us your sense of where they are versus what you're doing?
Vice Admiral Frank Whitworth: So, if I look at this from a STEM production rate and possible recruitment rate for people who are of Chinese origin, they're way ahead. They are. The number of STEM-related graduates is probably over five times what we experience in the United States. I think India is probably right in between, and so they're probably twice what India produces. This is concerning. We are a very STEM-oriented agency, and certainly people who [00:15:00] understand AI and understand this tradecraft would probably benefit from being STEM graduates. What I like, though, about the United States is how experienced we are with critical thinking and with a tendency to tell the whole truth and nothing but the truth first, right up front. And this has been tested. I, I would tell you, as somebody who's been invested in, you know, multiple conflicts over the last 34 years, we have a really good tendency to get the bad news up the chain very, very quickly. And so during the course of the fog of conflict, the fog of war, we're going to opt to get that information up and seek clarity, and to help our decision makers also seek clarity and know, and know before they make decisions. I don't know, and I can't tell you, whether the Chinese actually have that advantage right now. I would not say so, because a lot of this is predicated on experience in combat.
Tammy Haddad: Let's turn to Ukraine, the war in Ukraine and the role of AI there, [00:16:00] and what do you think the opportunities are, the challenges? It's been, wow, like a year and a half to almost two years.
Vice Admiral Frank Whitworth: So, I can't speak to whether the Ukrainians are independently applying AI. They're very resourceful. They're very IT savvy. Would not be surprised to find out that they are. I think we've got one or two reporters who are just there, and they could probably have more authority on that topic. I do know this. If there is a way to provide an advantage to the Ukrainians, that is our writ. Through EUCOM [U.S. European Command], we are providing a tremendous amount of information to ensure that they have what they need. We don't do the targeting for them. They make their own independent assessments. They make their own independent decisions on what they will actually neutralize. But we have a responsibility, as the President has stated from the beginning of this, to ensure that, that we're not holding back in, in the information that they need to make good decisions.
Tammy Haddad: Have you learned anything from this conflict?
Vice Admiral Frank Whitworth: I've learned how [00:17:00] important it is. And I had the advantage of having been the J2 for the Chairman, at the time it was Chairman Milley, when we received the first indications that we were going to have some sort of an invasion, right through the invasion and the first half year to a year of this conflict.
So I was a principal consumer of NGA material and NGA assessments. And, and I've used this in other forums, but I'm going to repeat it. Ultimately, as far advanced as we are with our power of presentation, with showing movement on, you know, on computers, and you can take a laptop in or a tablet, we sometimes like just going with that good old-fashioned two-dimensional map. And there is this.
Tammy Haddad: There's an NRO person here, by the way.
Vice Admiral Frank Whitworth: Right. And we have cartographic experience as well. So if there's a map that is authoritative, typically it's produced by [00:18:00] NGA, at least for military purposes. And we made this thing called the Big Green Map, and this is while I was still at Joint Staff, that took all of the INTs and combined it into something that was just two-dimensional and relatively compact, so that anyone who walked into any environment who needed to tell the story of what's happening on the ground always had the Big Green Map. I'm very proud of that because, while NGA ultimately put the TRONs onto a piece of paper and actually printed this, it represented the team of the IC. And all the insights that came from everyone, whether it was SIGINT or it was HUMINT or obviously GEOINT, it was the culmination of a team effort. And we're very good at that, both with something that is two-dimensional like a map, or something that's living and breathing like a geographic information system, a GIS. And that's where, if there are people who, you know, are listening during the course of this podcast and thinking about, "Well, what am I going to do with this geography major that's just this [00:19:00] specialization in data science or GIS?" Oh, they have a place to make a difference at NGA.
Tammy Haddad: So, the rest of the world is worried about deepfakes, fake everything. You have more data, we've learned, than anybody else in the whole world. Is there anything that you're using that could be used for commercial purposes, or for my, you know, 16-year-old daughter, anything that translates? And by the way, get your questions ready, we're going to take some audience questions.
Vice Admiral Frank Whitworth: This issue of validation is very important. I appreciate your asking that. So one of the things that we do have is a group of people dedicated to open-source imagery. And there's a lot of it out there. And as you know, an image can be very powerful. Especially if it's misleading and, and the consumer doesn't realize that. So what we do is we go through, and it's typically from social media. It might be through the press. But typically the stuff you want to double-check is that stuff that comes through unevaluated [00:20:00] press or some social media that may not be as reputable. But if it's really important, then we will actually put that product out with a validation index. So we're looking for other indications of maybe some manipulation, some forensics in there. We're looking for evidence on the ground that's not quite right relative to what we know is on the ground. And I tasked them about a year ago, this team. I said, "Hey, don't just put it out as a little product. Put a scale on there." One, invalid. Absolutely not. We're finding things inconsistent with reality, inconsistent with the truth. Five, valid. Everything seems to be checking. I have found that our consumers have, have really enjoyed at least having a hint of whether that's, you know, something that is correct or not.
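[The one-to-five validation index lends itself to a small sketch. The endpoints, 1 for invalid and 5 for valid, come from the conversation; the individual checks below and the way they are combined are purely hypothetical.]

```python
# Hypothetical sketch of a 1-to-5 validation index for an open-source image.
# The checks and the scoring rule are illustrative, not NGA's actual method.
from dataclasses import dataclass

@dataclass
class ImageChecks:
    forensics_clean: bool          # no manipulation artifacts found
    ground_truth_consistent: bool  # matches what is known to be on the ground
    metadata_plausible: bool       # timestamps and sensor details hang together
    source_reputable: bool         # provenance of the account or outlet

def validation_index(checks: ImageChecks) -> int:
    """Map the number of passed checks onto a 1..5 scale."""
    passed = sum([
        checks.forensics_clean,
        checks.ground_truth_consistent,
        checks.metadata_plausible,
        checks.source_reputable,
    ])
    return 1 + passed  # 0 passes -> 1 (invalid), 4 passes -> 5 (valid)

print(validation_index(ImageChecks(True, True, True, True)))     # 5: checks out
print(validation_index(ImageChecks(False, False, True, False)))  # 2: doubtful
```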
Vice Admiral Frank Whitworth: And I've said this all along, I think you would appreciate this. You started in the intelligence business, Mr. Woodward. We could trade, we could trade with any journalist, and after going through the sourcing and going through some of [00:21:00] the methods, probably trade jobs within a couple weeks. It's all about the sources you use, otherwise...
Bob Woodward: How about tomorrow? (Laughter)
Vice Admiral Frank Whitworth: It's all about corroboration and evaluating our sources, right?
Bob Woodward: So, yes, but I would say, you, you referred to pop-up episodes. Could you give us an example of a pop-up episode? It sounds like journalism.
Vice Admiral Frank Whitworth: It can be. And you know how emotional some of these images can be. So, if there is an image that is fomenting potential violence or unrest, we're going to want to make sure that that's a correct image. So I, I let our team evaluate, we can't evaluate every single image, but for the ones that we think would have a policy implication or a violence implication, we have a duty there. And I love that I'm surprised sometimes by what the team actually puts out and chooses to become a bit vulnerable with, with this validation [00:22:00] index.
Tammy Haddad: I like the validation index. Kellee, what do you think? Is that a good way to do it for the rest of us? This is Kellee Wicker from the Wilson Center, who's doing tremendous work on AI.
Kellee Wicker: Well, thanks for calling me out, Tammy. Actually, it was one of my questions, it was going to be, you know, as we see, after the executive order, the move toward government agencies implementing AI, you know, thanks to Bob Work and Project Maven, you guys are well ahead in understanding where to use AI systems and where to use the wetware. Do you feel like you'll be able to help advise other agencies on finding that divide? Or is there a playbook we can derive from that?
Vice Admiral Frank Whitworth: I do, but I don't want to overcommit or promise that there is total alignment between us and the other intelligence agencies. There's going to be a different form of AI for SIGINT. There's going to be a different form of AI in its application to HUMINT. However, there are some best practices that will transcend all of that. And we will absolutely [00:23:00] commit ourselves to ensuring that those lessons are made very transparent to the other agencies.
Tammy Haddad: Before we go to the next question, I want to ask you about the fear of AI. I wonder if you can give any guidance, because there's so many people that are just worried that it's the end of the world. And you have such a strong, positive outlook. And even as you're recruiting, you spend a lot of time talking about the gift of AI or how you've used it effectively. How do you fight against that? Because it's true within the military, maybe not in your department, but we've heard about it throughout the services.
Vice Admiral Frank Whitworth: It's natural for something this new and this potentially revolutionary to be something that brings fear. You're right. I'm excited about it, but we started very early with what we call GREAT, GEOINT Responsible AI Training, basically a certification: anyone who touches the code associated with AI and ML needs to be certified [00:24:00] to ensure that our basic principles stay basic principles, and that we don't overreach. This is something that I mentioned in a large forum within the last six months.
It's that staying principled in the midst of all this will ultimately be the way to get through it, right? Because at the end of the day, a human will be on this loop. If there's ever a time that a human chooses to be out of the loop, that will be a conscious decision. That will have to be at the highest levels, and I have not gotten that guidance.
Tammy Haddad: But these large language models already have the input. It's already there. What do you do with that?
Vice Admiral Frank Whitworth: So, this is again where you have to make sure that we're guarding against things that, I'm going to use the term loosely, hallucinations, literally, by some of these models, to ensure that we're not seeing them lead us astray. This is why the wetware will always be needed to kind of compare notes with what you might be getting from a large model. In the [00:25:00] visual sector for GEOINT, what I'm excited about, you've heard about these large language models. We anticipate, soon, large visual models. So instead of a visual detection that this is, you know, a glass of water, where it just tells you that it's a glass of water, it might actually say it's now three-fourths full. It will give you some context as to the behavior, or it's now moved about four inches to the right. That'll be a phrase. So that's kind of the equivalent in our world compared to what you're seeing in such an exciting way with LLMs.
Tammy Haddad: Wait, Jesse. Where's Jesse Kallman (CEO of Danti)? Is that what you're doing? Would you say that's comparable to what you guys are doing?
Jesse Kallman, CEO of Danti: What we're doing is more of providing, instead of looking within the image and saying that's a glass, we're looking at, we're looking for what is that glass doing, and how do you correlate different types of information to provide the context. So the human isn't having to do that; the human is focused on exactly what's going on and not having to go through 10,000 loose threads. [00:26:00]
Vice Admiral Frank Whitworth: That's it. Yeah, that's very exciting. It's all about context and layering. It sounds like you've got some layering going on. And obviously, as intelligence professionals, we always seek additional layers to add confidence. So I think that's exciting.
Tammy Haddad: And what about people worried about the five big companies that are ruling AI, or like to say they're ruling AI? Does that make you nervous, as opposed to coming from government or leading from government?
Vice Admiral Frank Whitworth: This is so big that I'm not sure I would necessarily gravitate to somebody who says that they, so what's the term? Did you say they are leading?
Tammy Haddad: Well, they're leading the way on AI and government's trying to catch up. Sounds like you guys aren't, but does that concern you, that so much of that research and so much of the work that's been done is concentrated on a few? And even, you know, like I was over at [00:27:00] Johns Hopkins University, they're going to hire like 185 researchers. They can afford to do that. But most universities that do traditional research, they don't have the money to do that. So, you know, you're talking about this, there's a class that already exists that's really successful.
Vice Admiral Frank Whitworth: Now, thank you. At this juncture, I would say it's exciting that there are a few that have the resources to invest in the R&D.
There will come a time, though, that you're going to also want to spread the wealth and ensure that we're getting the best from everyone. And I like the direction that we're moving in the Maven program to ensure that we are soliciting as many opportunities out there by smalls as we can. Now, that will have to be tempered by some security issues. We'll have to make sure that, as we protect these algorithms, we don't get too far afield. We don't want to go back to being just a project that's so flat and apparent that, frankly, you find your algorithms getting stolen.
Tammy Haddad: So you're looking to work with industry?
Vice Admiral Frank Whitworth: Oh, absolutely. [00:28:00] That's the name of the game for us.
Tammy Haddad: Okay. Steve, let's pass the mic over.
Steve Clemons: Hi, Steve Clemons with Semafor. I want to thank Tammy for this. This is fascinating. I know this is not something you... Thank you so much for doing it. You know, I, I spent a lot of time with those companies that Tammy just talked about. If you were to talk to one of their critics, Meredith Whittaker, who's the CEO of Signal, she said they've built up their business models based on surveillance. And so they have a monopoly, essentially an oligopoly, that small firms can never have. And so it's one of the differences why small players, she said, can be barnacles on the boat, but never be the boat. But the thing I'm really interested in is the next generation of this. And, and if you talk about the competition between Google and, you know, the, the big players in this area, they say the next arena is the hypercloud, not the cloud. The hypercloud is the place that big data, AI, quantum comes in. And hyper war is managed in a hypercloud. It's the next big [00:29:00] defense industry sale. And it's what they see happening on the ground in Ukraine. They see the synthesis of real-time intelligence, real-time targeting and response to the Russians, without human beings involved. And the instantaneous move from the place where Ukrainians are shooting their weapons, instantly moving to other places. And the Russians not having that ability. So we're seeing a real live version of the convergence of these technologies in Ukraine every day. And I'm just interested in that level, because it really is a pathway to counter what your point is. There's very little human interaction in that. It's all instantaneous, data-driven. I don't know if I'm worried about that or excited by that, but I just want to ask you where you're at, because I know business is driving to that instantaneousness, where there's no way human beings can be involved in that process if it's to be effective.
Vice Admiral Frank Whitworth: I love this question. Again, I don't have any guidance to take a human out of the loop, or, you know, right now, I think on the loop is a term that's being used. [00:30:00] But I take your point that there could be some forces out there that choose to take humans out of the loop with mission orders, autonomous kind of mission orders with vehicles. So, here's where I am. And this kind of speaks to what I've termed a reluctant RMA, a Revolution in Military Affairs, that deals with the unmanning of a lot of power projection. Not all of it, mind you, but a lot of it. And it's reluctant because, frankly, we're, we're really invested, as humans, in minimizing warfare. But if you've got to go for some sort of warfare, we are really invested in doing it best as humans. And we are invested in being out there.
It's kind of...
Steve Clemons: This is not like the B-2 bomber debate? Before drones?
Vice Admiral Frank Whitworth: Yeah. I, I think that there's a lot of that. Again, I, I use the term reluctant RMA, but I'm also encouraged when I see the press reports and the releases coming from the Deputy Secretary of Defense on this initiative to actually seek more autonomous vehicles. The Replicator initiative, which I think is really [00:31:00] exciting. So if we're, if we're worried about the potentiality of a, or the potential of a foreign actor going for autonomous mission orders, I have written and spoken in one or two speeches about maybe an offset of autonomy from a self-defense perspective. So there the wetware will still be necessary to train the algorithm to now go after, to go after these particular autonomous vehicles as, as a way of neutralizing that. I could see that as a new way of thinking. That's brand new. That's, that's very, very hypothetical. I don't know if that addresses it; I hope that addresses your point. That's kind of where I am. I could see that going in that direction.
Tammy Haddad: John Hudson.
John Hudson: Admiral, John Hudson, Washington Post. You mentioned in your opening remarks the United States having a very high standard when it comes to targeting and when it comes to proficiency of targeting when it comes to fires. [00:32:00] When you look at the last two months of what's happened in Gaza, how do you compare U.S. standards for positive targeting with the government of Israel's?
Vice Admiral Frank Whitworth: Yeah, so, I appreciate the question. I'll just offer that any sort of opinion relative to the conduct of what's happening in Gaza, I'm going to reserve as the, you know, certainly the responsibility of the Secretary of Defense, Secretary of State, and obviously the President, and they've been very outspoken on that. Let me tell you what we do to contribute to their knowledge. We just tell it like it is. I like to use a baseball analogy: we call balls and strikes. And so while we're very committed to ensuring that the State of Israel has everything they need, and that is something that the President has articulated from the get-go and is certainly a mandate for us, we also tell the story of what is happening on the ground, and we don't hold back. There's no selective editorializing in what we choose to cover [00:33:00] and what we don't. I'm really proud of the team. From the beginning, that's been our tendency.
John Hudson: When you say you call it balls and strikes, is that information given to U.S. leaders, or is it given to the leaders of Israel?
Vice Admiral Frank Whitworth: So, I won't comment on intelligence sharing, you know, the President has indicated that there is intelligence sharing that's helpful to Israel, but I, I do absolutely have a responsibility to ensure that the President, the Secretary of Defense, the Chairman, everyone in the IC, the Secretary of State, and the interagency, that they have the best information.
Bob Woodward: Would you give them targeting information?
Vice Admiral Frank Whitworth: Them being, them being?
Bob Woodward: The Israelis?
Vice Admiral Frank Whitworth: We wouldn't necessarily call it a target, sir, we're not involved in that type of thing. So all we do, again, is to ensure that they have the information that they need.
Bob Woodward: That's not even a half answer. (Laughter)
Vice Admiral Frank Whitworth: Thank you.
Tammy Haddad: Before we go. Oh wait, Teresa.
Bob, will you pass the mic back there?
Teresa Carlson: Sir, love, love what you're doing. [00:34:00] I spent a lot of time working with NGA when I was at Amazon. And the cloud, the cloud is a big part, I know, of what you do. I'm at General Catalyst now, so I get to invest in all these great companies doing things to help our U.S. Government, but can you talk about how you're using edge capabilities within AI now? Most Americans really, I don't think, understand the importance of what NGA actually does for the mission. But edge capabilities, large language models at the edge, as we use drones for warfighting and targeting. Can you talk about how you are looking into edge capabilities at NGA and making that more easily and readily available for the type of information that you have to pull and deliver?
Vice Admiral Frank Whitworth: So we, I want to make sure I don't misinterpret [00:35:00] your definition of edge. I'm going to give you mine: when I hear edge within the confines of NGA, it's typically associated with the last tactical mile and with ensuring that there is redundancy and resiliency of, of networks. As I, as I mentioned, we, we move so many ones and zeros, we have to be able to ensure that if that is denied in one particular portion of the network, it doesn't bring down an entire network. And so we have chosen a joint regional edge node approach that will be instrumental to ensuring that if we lose something somewhere, you don't lose it all. I, I hope that makes sense.
Teresa Carlson: It does, and through satellite capabilities now as well, you're using that more readily with edge in the field, with our military. They had compute capabilities, but now, their ability to take advantage of AI through large language models with that same edge capability in the field with the warfighter.
Vice Admiral Frank Whitworth: So, with, through exercises especially. So as we develop some of these applications, we make them available through graphical user interfaces, GUIs, to ensure that a normal user can actually use it with an advantage.
Tammy Haddad: What's a GUI?
Vice Admiral Frank Whitworth: So, that's a, that's literally, that's like the application that's on your screen. So, as you manipulate and you enter data, that's an interaction that you're having, a graphical user interface. So, a GUI. Maven has one that's called Maven Smart System. And it's actually something that is relatively easy to use, and we do exercise with the Maven Smart System. And so, sometimes we get so attracted to the GUI, we forget about the level of compute and the power of what goes into the output. Now, that's what we get very excited about, but believe me, we have a lot of consumers in the combatant commands who are excited about the GUI, about the Maven Smart System. And that's okay. That's okay. As a targeting professional, you want something that keeps your workflow organized. And that's the power of a GUI. It keeps you organized as you also have this huge compute power.
Tammy Haddad: Do you use the GUI for climate change? Because I think, one, to just move things a little bit over, I think that what you're, you know, people are learning so much about what you do based on that. That you're showing us how things have changed.
Vice Admiral Frank Whitworth: No, this is a great question, because while we don't use Maven Smart System or Maven for this, we use an aspect of AI relative to ice. And we have a responsibility to show change.
Tammy Haddad: To ice, but not ICE at DHS, right?
Just to be clear, for those who aren't paying particular attention.
Vice Admiral Frank Whitworth: Thank you for that clarification. No, we're talking about ice at the poles and the change of the earth, right? And so the amount of change at the poles is obviously substantial. A lot of people don't realize we have a responsibility for mean sea level. We have a responsibility for what we call WGS 84, the geodetic system for the earth that we use for all of our maps as to what is the precise place on the earth that something's happening. We do that. So when you're talking about elevation models and ice, you're talking about a lot of data, especially from synthetic aperture radar, tons of data that's showing that the ice is changing compared to, say, sea level. And for that above sea level, we have established something called a digital elevation model, and we have put that out now. It took some changes in contractual terms, but now we're putting it out to academia, so that everyone has access to that. That's very, very powerful for people who are looking at the rate of climate change.
Bob Woodward: And how screwed are we?
Tammy Haddad: Yeah.
Vice Admiral Frank Whitworth: I'll let someone else, you know, I, that's a nice thing, it's just data. And we're responsible for the sanctity of the data. We will not go into the why or the how or what to do necessarily on that, okay, we'll let the others do that. But there's so much data, we are applying some elements of AI to that, because it's too much for a human to do on a daily basis, so it does save time in that regard.
Tammy Haddad: So you have all these incredible pictures, you've got the ice, you're putting it together. Do you share it with other governments? I know you're not putting analysis on it, I guess the question is, then say, hey, we'll be much better off if we slice and dice it this way. I mean, how much leeway do you have?
Vice Admiral Frank Whitworth: Well, actually, we put a lot of pressure on ourselves to ensure we have no leeway. To ensure that we are, it's just balls and strikes. Just call the balls and strikes, which in this case is, what's the mean sea level? What's the elevation model? What's the thickness of what we call permafrost? These things are changing. And we have a responsibility to just say it like it is, based on good data. Sometimes people will go places and they might get bad data that might be framed or shaped to an outcome. We have no outcome-based expectations here.
Tammy Haddad: I think you have the best job over there. I gotta say, no analysis required. Wait, one more very fast, if you're very fast. Daniel Lippman, Politico.
Daniel Lippman: Daniel Lippman with Politico. I have a quick question. How worried are you about wars in space knocking out your capabilities? The next time we have a conflict, we're going to be blind, basically. And what are you guys doing, too?
Vice Admiral Frank Whitworth: We're worried enough that we wanted to make sure we communicate with the American people that it's from seabed to space, so we changed our motto, literally. It used to be very focused on the rock itself, of the Earth, and we wanted to ensure that there are, there are domains where this issue of distinction and warning do apply that normal Americans don't see. Such as space and such as the seabed. And we do have responsibilities there. So we went ahead and added "from seabed to space" into our [00:41:00] motto. So the distinction of behaviors that are either responsible or irresponsible in space, that is in our purview.
But that's about where we stop talking, because the how gets extremely sensitive. And so I hope that everybody will take my word for it. We have people that are worried about behaviors in space.
Bob Woodward: When were you founded?
Vice Admiral Frank Whitworth: So that's a, it's a great question that goes into the history of the Defense Mapping Agency, and even before the Defense Mapping Agency, some of the charting episodes that came during World War II, which is where we get a lot of our history in St. Louis. And so there's a very exciting story that will probably be, that'll be another session, on what's happening in St. Louis with the ecosystem there. That's why we have 3,500 people in St. Louis. But then we became NPIC [National Photographic Interpretation Center], which was effectively part of CIA. And then we became NIMA [National Imagery and Mapping Agency]. That was largely founded on the need, as a result of the First Gulf War, 1990 timeframe, where people weren't getting the GEOINT they needed at the time they needed it. And so Congress mandated that we have a NIMA at that time, an agency dedicated to GEOINT. And then it was renamed to NGA, which was the combination of those cartographic responsibilities of DMA with NIMA, and we became NGA. And so GEOINT is kind of the all-encompassing term now.
Tammy Haddad: Well, thank you so much. Please join me in thanking Admiral Whitworth. It's absolutely fascinating.
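[A note on the elevation-model discussion above: the core operation, tracking how gridded elevations change over time against a reference like mean sea level, can be sketched in a few lines. The grids below are synthetic; real digital elevation models, WGS 84 referencing, and synthetic aperture radar processing are far more involved.]

```python
# Minimal sketch: quantify elevation change between two gridded digital
# elevation models (DEMs) of the same area, as one might for ice monitoring.
# Both grids here are synthetic stand-ins, not real NGA products.
import numpy as np

rng = np.random.default_rng(42)
dem_2013 = 1000.0 + rng.normal(0.0, 5.0, size=(512, 512))          # meters above the datum
dem_2023 = dem_2013 - 2.0 + rng.normal(0.0, 0.5, size=(512, 512))  # simulated thinning plus noise

diff = dem_2023 - dem_2013
print(f"mean elevation change: {diff.mean():+.2f} m")
print(f"cells lowered by more than 1 m: {100.0 * np.mean(diff < -1.0):.1f}%")
```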