Distributed Energy & Edge AI Infrastructure: Karl Andersen on the Future of Compute

In this episode of Arthur’s Round Table, Karl Andersen explains why the future of AI may depend less on software—and more on energy infrastructure. As hyperscale data centers strain aging electrical grids, Karl shares how distributed energy, decentralized compute, and edge-scale AI infrastructure could become the next major evolution in both energy and cloud computing. From solar-powered homes to localized AI networks, this conversation explores how the collision between AI and energy is creating entirely new markets.
________________________________________
🎯 What You’ll Learn
• Why AI is creating a massive power and grid crisis
• The difference between hyperscale and edge-scale data centers
• How distributed energy can power AI infrastructure
• Why solar and battery systems are economically underutilized today
• How decentralized compute changes cloud architecture
• Why the current electrical grid was never designed for AI demand
• How homeowners and businesses could monetize excess energy
________________________________________
🧠 Key Insights from Karl Andersen
________________________________________
1. AI Is Creating a Power Crisis
The rapid growth of AI and hyperscale data centers is overwhelming electrical infrastructure.
👉 Existing grids were designed decades ago for a completely different level of demand.
________________________________________
2. The Grid Was Never Built for AI
Karl explains that today’s electrical grid was built in the 1960s for:
• homes
• small businesses
• industrial loads
—not modern AI systems consuming city-scale levels of power.
3. Hyperscale Data Centers Create Systemic Risk
Large centralized data centers:
• require enormous power loads
• create grid instability
• introduce security vulnerabilities
👉 one facility can consume as much power as a major city.
4. Distributed Energy Is the Unlock
Karl’s thesis:
👉 instead of sending energy through inefficient grid infrastructure, bring compute directly to where energy already exists.
Examples include:
• homes with solar
• businesses with excess rooftop power
• farms and industrial facilities
________________________________________
5. Edge-Scale Data Centers Could Reshape Cloud Computing
Rather than building giant centralized facilities:
👉 edge-scale infrastructure distributes smaller compute nodes across thousands of locations.
This creates:
• lower latency
• faster deployment
• greater resilience
• more flexible scaling
________________________________________
6. Solar Economics Are Fundamentally Broken
Many solar owners:
• produce excess energy
• sell it back cheaply
• or get curtailed entirely
Karl argues compute may become a far more valuable “offtake” for power than traditional utilities.
7. AI Compute Is Becoming a New Energy Market
A major insight from the conversation:
👉 electricity itself may become an investable digital infrastructure layer.
8. Decentralized AI Could Increase Resilience
Edge infrastructure creates:
• redundancy
• geographic distribution
• lower vulnerability to outages or attacks
👉 compared to centralized hyperscale systems.
________________________________________
9. Energy Could Become an Income-Producing Asset
Homeowners and businesses may eventually:
• monetize excess solar
• host localized compute
• participate directly in AI infrastructure economics
👉 similar to how Airbnb monetized unused real estate.
10. The Future Grid May Be Decentralized
Karl describes a future “Grid 2.0” where:
• homes
• batteries
• localized compute
• microgrids
• AI systems
all interact dynamically through distributed infrastructure.
👤 About Karl Andersen
Karl Andersen is the founder of Lektra, a company focused on distributed energy and decentralized AI infrastructure. With a background in finance, energy markets, and investing, Karl is developing edge-scale compute systems designed to connect distributed energy resources directly to AI workloads and cloud infrastructure.
Unknown Speaker (0:00): Hello. Welcome everybody to another episode of Arthur's Roundtable. Totally grateful for everybody who's paying attention and sharing it and all that business. We super appreciate it. Thank you for listening in.
Unknown Speaker (0:12): Carl, we have a guest today. Carl and I have actually known each other since right before COVID, when we did a conference at the UN for our impact friends. It ended up being a good conference, not because I did it, but because it was full of really interesting family offices, and we had people speaking who, rather than getting up on the stage and talking about how the world's going to come to an end, came up and said, here's our construct, here's our business, here's how we can demonstrate we had impact. And they could prove it. So it was a little different than most things, where people come up, say the world's going to come to an end, and get off the stage with no solution. So Carl and I had a chance to speak prior to that, and I'm glad to see him come back and participate in the podcast with us.
Unknown Speaker (1:02): And, as you're going to find out, one of the things of note is that before we had a compute crisis, an energy crisis, and a power crisis, he had a method of addressing that problem. You're going to hear about it today. Is that fair, Carl?
Carl (1:18): Yes, absolutely. No. Thank you for, thank you for having me. And yeah, there's plenty of problems we can identify to try and solve. So yeah, no, but thanks for having me.
Carl (1:29): So, you know, by way of background,
Unknown Speaker (1:33): Yeah, let's start at the beginning a little bit, if you don't mind.
Carl (1:36): Sure. Yeah, totally. So, you know, basically this energy crisis started some time back. We actually started thinking about this grid-constrained stuff back in like 2018, 2019.
Carl (1:53): So I've been in finance my whole life. I've been a banker, run a hedge fund, run a broker-dealer, run a reinsurance company. So I've been sort of an entrepreneur in the financial services space. And then my last fund was really looking at the energy transition. Norway was principally where we were investing, Norway and the Nordic region. And they were kind of leaders in the energy transition because they're an oil country by background.
Carl (2:27): Domestically, 98% of their power is hydro. And whether you're right or left in the government, it didn't really matter; they went all in on this electrification idea. So much so that now 70% of cars on the road are EVs. But as you can imagine, in a small country like that, with that much power floating around and that much demand, it started causing a lot of strain on the grid.
Carl (2:54): And a lot of people had started getting into battery trading early on, or high-frequency trading, which is really just, you know, like markets: there's volatility on a grid, and people started taking advantage of that. But we took a step back. My fund launched in about 2020, and we started following this. And we're like, you know, every single problem we're having in terms of energy, distributed energy, which was the main investment thesis, keeps bottlenecking at the grid. And so it eventually dawned on me that if you think about energy markets, there's really two main markets. There's oil and gas, and it's a competitive market.
Carl (3:33): And then you get to the grid with electricity, and it's kind of a monopoly. And so all these energy assets that we're creating end up bottlenecking at the market of distribution. Right? And the problem with that is, something like solar, a technology that's been around for thirty-plus years, that's gotten super cheap, down to around 3¢ a kilowatt-hour, still looks unprofitable to people.
Carl (4:00): So like in the state of New York, if I sell solar back to the grid, I'm getting 6¢ a kilowatt-hour. Well, that doesn't look like much, and I keep going to the federal government lobbying, you know, blah blah blah. In Norway, we decided, well, what else can we do with this energy? Maybe it's the offtake that's the problem. And I think that was the key unlock.
Carl (4:17): It was like, well, how else do we sell or use this energy for something else? And so we started testing it on charging stations, for example. You had a lot of cars there, pretty consistent, but that wasn't enough. We're like, where is the next big offtaker gonna come from? And so this was about 2023, and over the horizon we see ChatGPT coming, and we're like, actually, they're gonna demand a ton of power.
Carl (4:46): And so we started really looking into the grid and the addition of these hyperscalers trying to get onto the grid. I mean, even in New York, I can explain: we already knew that there would be grid capacity constraints by 2029. With hyperscalers, that's moved to the end of this year. So we knew that the grid issue was gonna be a problem and that decentralized energy was a solution to this. So we had written the patents basically on all distributed energy as it relates to powering GPUs, TPUs, DPUs, and 16 other processing cards.
Carl (5:18): Cars. This was kind of the solution that we marry distributed energy assets to what we now call edge scale data centers as opposed to a hyperscale data center. But for people who don't understand the grid, I could take a step back because it took me a while to really understand grid dynamics. I mean, these things were built in nineteen sixties, and they were meant for people with two TVs and six lamps. And transmission central generation meant to go out and then some industrial uses, but this was really the rate payer.
Carl (5:45): The industrial payer was just an end user, and that was it. And then you sort of flash forward into sort of the renewable market and you have people then participating bidirectionally. And so there's something called inertia. It stays has to stay at, like, 60 hertz, the grid. And so when you're adding more power on, it can disrupt the wave sign of the of the grid.
Carl (6:06): Now let's add in hyperscale data centers. So hyperscale data center is basically like adding the whole city of New York onto the grid and then turning the lights on and off all at the same time in milliseconds. You know, like you at your house, have a power surge, lights go out, you have to go to the basement, flip the switch and, you know, trips the circuit breaker, etcetera. Same thing can happen in a city. But there's much more damage that can happen when you have, you know, all of a sudden you calls cause these massive waves and surges.
Unknown Speaker (6:38): Transformers and that sort of thing.
Unknown Speaker (6:40): Those can blow. You know,
Unknown Speaker (6:41): have damage. Is that the inertia? I'm sorry to interrupt, but there was some sort of limitation. Like if you wanted to put a Tesla roof on and you had 10 batteries, or three, or whatever the number was, the utility would limit how many batteries you could get for some reason. I don't know what that reason was.
Unknown Speaker (7:05): Does that have to do with the inertia, that they didn't want to buy back that much because of the influx and the inertia? Or was it something else?
Carl (7:13): Well, actually they want more batteries on the grid. It could have just been that they have too much. It could have had some element of that, because you do slow inertia sometimes; batteries don't do that, but solar does. It may have just been, who knows? They have a lot of things that go into why they have certain measurements on them.
Unknown Speaker (7:31): Fair enough. Yeah.
Carl (7:32): But, you know, the batteries are useful to the grid in the sense that the utility grid operator can draw on power when needed. And this is actually part of the strategy of some of the companies we invested in before in the fund, a company called TGN Energy. They do high-frequency trading. They actually stand idle, and the utility grid operator can draw on that power if needed. So that helps flatten the waveform of the grid.
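The balancing role Carl describes for idle batteries can be sketched as a toy model. This is illustrative only: the sensitivity constant is made up, and real grid frequency dynamics involve inertia, governor response, and far more than a single linear term.

```python
# Toy illustration (not real grid physics): a battery that injects power
# to offset a load surge keeps frequency near the 60 Hz nominal.

NOMINAL_HZ = 60.0
SENSITIVITY = 0.1  # Hz deviation per MW of imbalance (made-up constant)

def grid_frequency(load_mw, generation_mw, battery_mw=0.0):
    """Frequency after a net imbalance; battery_mw > 0 means discharging."""
    imbalance = generation_mw + battery_mw - load_mw
    return NOMINAL_HZ + SENSITIVITY * imbalance

# A data center suddenly adds 20 MW of load with no battery support:
no_battery = grid_frequency(load_mw=120, generation_mw=100)      # dips below 60
# Same surge, but an idle battery fleet discharges 20 MW to cover it:
with_battery = grid_frequency(load_mw=120, generation_mw=100, battery_mw=20)
```

The point of the sketch is only the mechanism: standby batteries act as a fast counterweight to sudden draw, which is why grid operators want more of them.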
Carl (7:59): So, yeah, it's really that AI is testing not so much the capacity, although there is a capacity constraint to it, but really the architecture of the grid. And this is just the beginning. Like, with AI, there's twenty-four-month lead times with AWS. And we were just at GTC, met Jensen Huang, you know, finally, which was cool, leather jacket and all. But it wasn't about chip lead times.
Carl (8:24): I mean, there's been a lot of talk about VRAM and memory chip pricing and this sort of stuff. But the real problem was power, and trying to find additional power. Right now, there's just not enough power from the grid to be able to build out all these hyperscale data centers. And so where are they gonna go? Most of the hyperscalers have gone bigger.
Carl (8:47): They've tried to fire up, you know, Three Mile Island; I think Microsoft is gonna get that deal. They're buying up gas turbines. They're building bigger. We said that's kind of the wrong approach.
Carl (8:59): There's two dynamics here that we can fuse together to solve this problem. One is that all these people haven't been making money off their solar. That's the big problem. And then the lead time to get the power off the grid, it's just not gonna be there. So what we came up with was something called edge-scale data centers.
Carl (9:17): Instead of hyperscale, we're edge scale, and that's defined by 20 megawatts and below. 20 megawatts because there's moratoriums now; the state of New York and Maine have each declared one, where you can't build hyperscale. They're taking off so much power at the expense of ratepayers that, obviously, costs are going way up. So what we do is we marry all of the distributed assets, which are either stranded or just not making enough money, and we basically put an edge-scale data center at someone's physical house location or their business location.
Carl (9:50): So in the case of a home, these things are like the size of a desktop computer. We are partners with Dell. We just order it, it comes to your house, you plug it into the wall, connect to high-speed Internet, and that's actually part of our decentralized network. Or if you're a business, let's say a Lowe's or Home Depot or whatever, with five megawatts of rooftop solar, you can fit about a megawatt of servers in your site location, so you co-locate it.
Carl (10:15): Then through dark fiber, or fiber or whatever, you also connect to our cloud. So our cloud is a totally decentralized cloud, but now we can build where the power is. Turns out it's cheaper: instead of sending electrons through a distribution channel like the grid, we just send the data through fiber, and you eliminate all those transportation costs. So now solar, which is 3¢ a kilowatt-hour, is making between, you know, a dollar to $52 an hour versus 6¢ a kilowatt-hour.
Carl (10:45): Or in the state of California, you even get curtailed like 40% of the time. So all this net metering stuff, where you were promised you'd make all this money, even in California, that's all going away. But we're linking together those two things, and we give 80% of our revenue to what we call an energy host. So Lektra is ultimately two businesses. It helps these energy hosts, homeowners, and the like.
Carl (11:10): And then we have our cloud business, which feeds the energy hosts. The cloud company offers GPU rentals, inference, fine-tuning, etcetera. Customers pay us, and then ultimately the energy hosts get 80% of the revenue. So to give you another metric, for a gigawatt of power, a utility makes about $80,000 per gigawatt. If you convert that to compute, you're gonna make anywhere between $750 million to a billion dollars on that same gigawatt.
Carl (11:41): So 80% of that goes back to the energy host.
Unknown Speaker (11:42): That's a huge delta.
Carl (11:45): That's what I was saying. The dislocation of power, the arbitrage between power and the demand for compute, is enormous. That has to do with the inefficiency of the grid and then the need for power from the compute side. And in many ways, we're sort of disrupting the energy market in the sense that we're almost like the shared economy for power: we're allowing any merchant of power to enter into the power markets. So kind of like Uber was to taxis, or Airbnb was to hotels and apartments,
Carl (12:18): we're doing that with energy, offering a third way for people to make money.
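Carl's gigawatt comparison works out roughly like this. The figures are his rough numbers taken at face value (the time period behind the $80,000 utility figure isn't specified in the conversation), so treat this as back-of-the-envelope only.

```python
# Back-of-the-envelope on Carl's quoted figures: selling a gigawatt to a
# utility vs. converting the same gigawatt into compute revenue, with 80%
# of compute revenue paid back to the energy host.

UTILITY_REVENUE_PER_GW = 80_000          # Carl's figure for utility offtake
COMPUTE_REVENUE_PER_GW = 750_000_000     # low end of his compute estimate
HOST_SHARE = 0.80                        # revenue share paid to energy hosts

host_payout = COMPUTE_REVENUE_PER_GW * HOST_SHARE
multiple = COMPUTE_REVENUE_PER_GW / UTILITY_REVENUE_PER_GW
# host_payout -> $600,000,000 per gigawatt; multiple -> 9375x the utility path
```

Even if the quoted numbers are off by an order of magnitude, the gap between the two offtakes is what drives the whole thesis.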
Unknown Speaker (12:23): So the building blocks, to tactically think about this: I'm a company, say I'm Home Depot. I've got the solar panels; I already did the CapEx for that. I'm using as much as I can. I can't sell it back to the utility because they don't want it, or whatever.
Unknown Speaker (12:41): Right? Then where do you plug into that?
Carl (12:45): So yeah. We take a server, we install it at Home Depot, maybe in their server room or utility closet, whatever. And then the extra energy that's coming from that building, or from the net solar, runs the processes of the GPUs at the site location. It's literally like for a different customer.
Carl (13:05): No, for Lektra's cloud. So think about it like this. You take a hyperscale data center, you know, you have the racks; you break it up into its parts.
Carl (13:13): And instead of having one power source, now we're taking smaller pieces of the hyperscale data center, and we're putting them where the power is. So the power is just running it locally. It's like a big computer at your house. Yeah. Just a bunch of
Unknown Speaker (13:27): you know?
Carl (13:27): And then you're connected through fiber, and that fiber connects you to our Lektra cloud platform. And what's cool, what's different, how we're different than any other cloud in the world, is that when you go to Lektra Cloud, you have a prompt bar just like ChatGPT. So you type in, like, pandas in space or something. I could even share my screen and show you, but I don't know if we're set up for that. But, basically, you type in pandas in space.
Carl (13:52): You'll get some images, and you pay for those images. But beneath that, you get these different bars, and those are the different power sources that are coming in. So it'll be like four different GPUs from four different site locations that rendered those images. And then it tells you the fuel type. So it'll say solar, batteries, solar plus batteries, hydro.
Carl (14:11): It'll give you the type of fuel you just used. And we've trademarked this; it's called carbon-free compute. In real time, you know that it's green. So we put a logo on it, and then you're certified carbon-free. So it answers that question too: is AI environmentally sound?
Unknown Speaker (14:28): Well, you know, now you can make a bunch of AI slop on our platform, environmentally free.
Unknown Speaker (14:33): So give me an example. You don't have to tell me who it is. But somebody needs compute. And they'll just tap into your cloud when they need it.
Carl (14:45): Well, yeah, our interface is just like any other cloud, like ChatGPT. From the user point of view, you don't know where any of this is, other than you see the different bars that are working. Now, what is unique about us is we're actually going hyper edge scale. Meaning, right now, AWS has 130 site locations globally.
Carl (15:11): Google also has a bunch of locations, and Google owns Waymo, for example. So they have a hyperscaler that deals with Waymo in, let's say, San Francisco. But what Waymo actually needs, for low latency to run the network better, is servers within San Francisco. So Lektra, because we have a partnership with Sunrun, can literally
Carl (15:31): have 20,000 servers just in San Francisco networking with all those Waymos. Hedge fund managers, they're doing high-frequency trading, so we can locate servers much closer instead of having their GPUs run at some hyperscale location. So these are some of the markets we serve. For media and entertainment, we're working with the movie industry on carbon-free movies, their CGI and their VFX, so we can make them carbon-free.
Carl (15:58): But the reason why we're building in this way as well is because, number one, edge scale is good for fine-tuning and inference, which is where the market's actually headed. But when you think about this: the grid is already constrained for AI. Now you're gonna add in robotics, automation, advanced manufacturing. All of these markets, where are you gonna get the power from? You know, I've asked this.
Carl (16:23): I've asked this to a lot of people, Department of Energy, Department of War, up and down the Pentagon. Because in 2021, ERCOT, the Texas grid, almost went down completely. It was four minutes from complete failure, and I don't know how many people know that. But okay.
Carl (16:40): That's one thing, an attack or some sort of catastrophic event. But marginally, when the grid is at capacity, what are we building? What's the response? Zero. No one has. The national labs, ERBA, you know, all the government, I figured someone would have an architecture. Nothing.
Carl (16:59): Silicon Valley?
Unknown Speaker (17:00): That's a little troubling.
Unknown Speaker (17:01): You know what's working?
Unknown Speaker (17:02): Yeah. Yeah. Super troubling. So do you have capacity? In other words, if your cloud is working and it's on demand for Waymo or whatever it is, is there enough power, from whoever's giving it to you, to satisfy that need?
Carl (17:22): Oh, yeah. The thing about our platform is there's unlimited power, an unlimited power base, to do this from. Let me see if I can actually do this.
Unknown Speaker (17:33): Should be able to. I don't know if we flip that switch, but you should be able to.
Unknown Speaker (17:37): Let's see. View presentation. You're still there?
Unknown Speaker (17:40): I'm still here. I don't see it. I I really have. Yeah. I actually don't know how to
Unknown Speaker (17:47): Oh, there we go. One sec.
Unknown Speaker (17:48): Oh, sure. Down here. Yep.
Carl (17:50): Okay. So now you're seeing Lektra's cloud. So you come here, and let's say you wanna generate images. That's what I was saying.
Carl (18:02): We'll do pandas in space. You can run, like, 600-plus different models on our platform. So I'm a developer; I come to Lektra's web interface. DreamShaper is just a fun model. Now down here, these are the different servers that execute on this job.
Carl (18:24): This server, our partner's, I'll just tell you, is in Wisconsin. G4, this one's in Sulphur Springs, Texas. So these different servers executed this job. And what we're embedding is these fuel types. This is all solar, by the way, and I know it is, so the UX will eventually say that.
Carl (18:43): Then you can actually click in, and it tells you how much wattage was used and the time it took. Then you can click into each one of those servers and see the historic power at that site location. And in the future, if I'm Waymo and I need servers just in San Diego, we have a map so you could cluster the servers you need in a location, and those servers would work on jobs at that site location. So no one else in the world has this. This is the cloud of the future, and it's infinitely expandable for power.
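The per-job attribution Carl demos (which servers ran a job, where, on what fuel, using how much energy) could be represented with a structure like the following. Every field name here is invented for illustration; this is a sketch of the idea, not Lektra's actual API or data model.

```python
# Hypothetical per-job attribution record: a job lists its contributing
# servers with site, fuel type, and energy used. A job counts as
# "carbon-free" only if every contributing power source is green.

GREEN_FUELS = {"solar", "battery", "hydro", "wind"}  # assumed taxonomy

def is_carbon_free(servers):
    """True if every server that worked on the job ran on a green fuel."""
    return all(s["fuel"] in GREEN_FUELS for s in servers)

job = {
    "prompt": "pandas in space",
    "servers": [  # locations taken from Carl's demo; numbers invented
        {"site": "Wisconsin", "fuel": "solar", "watt_hours": 3.2},
        {"site": "Sulphur Springs, TX", "fuel": "solar", "watt_hours": 2.9},
    ],
}

total_wh = sum(s["watt_hours"] for s in job["servers"])
certified = is_carbon_free(job["servers"])
```

The design point is that certification is computed per job from per-server provenance, rather than claimed at the fleet level.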
Carl (19:17): So anyone can be the energy host. And I'll just show you a quick thing in our presentation; it's easier. Let's see. So this is the design, this is the business model. Basically, for a home, you'd have a smaller server.
Carl (19:34): And then for businesses, you'd have a larger server. And those feed up to the cloud. And then you have cloud customers, just like I was doing there, who work on the cloud. And then the money flows down to the energy hosts. And so you can have an infinite amount of energy hosts supplying power to our cloud.
Carl (19:54): And our cloud doesn't have to be the only cloud either. We can run an API to other clouds, AWS or whomever. So we're really just AI infrastructure being built out over time.
Unknown Speaker (20:07): So, Carl, this is, like, totally unknown, right? Meaning, I'm sure you've done some work to socialize what you're doing. But when you typed the prompt in, it was using the compute to get a result. It's not an LLM on its own, it's just compute.
Carl (20:29): Right. So most of the market, and that's what I showed you, is inference. Like, people who already have trained models and are doing that. I would think training is still better on a hyperscale data center. You can train on a network, but it might be slower than you would get from a clustered effect.
Carl (20:45): But whatever. McKinsey says like 80% of the market is going to inference. Once you've built the models, they're out there, you know? So where we also shine is we've partnered with Dell in this way, and I'm speaking at Dell World actually next week.
Carl (21:03): Businesses have not adopted AI as fast because they don't wanna give their data over to Meta or OpenAI. So what we can actually help them do is, we take the server, it goes to, like, KPMG or something. They can do all their analysis with an open-source LLM that's already been made. They can do all their data crunching. Then during the idle time, we can use it for our cloud.
Carl (21:24): And it'll pay KPMG money in the interim. So it can finance their server outside of the cloud. So it's actually enabling businesses to localize their data, localize their energy; it kinda helps them be more resilient. Also, I think edge scale is important for resilience anyway. So I'm actually part of something called Business Executives for National Security, and I know this guy, General Timothy Ray, a five-star general who ran SEPCOM for years.
Carl (21:56): And I have breakfast with him. This was like a month and a half ago or whatever. And I was like, hey, man, you know, hyperscalers, I know they say AI is gonna take over planet Earth, it's gonna take over the world, but not if you bomb the data center. It doesn't come back, you know?
Carl (22:11): This is a huge gap. He's like, well, that's true. Two weeks later, Iran sent bombs to the UAE and bombed two hyperscale data centers. Is that crazy? So you think about the architecture of what we're trying to do here: we're building hyperscale, which is critical infrastructure, on top of critical infrastructure, the grid, which has a ton of vulnerabilities.
Carl (22:33): I mean, it's like, what are we doing? So I think edge scale, even from a business resilience and redundancy point of view, is a smart way to go. You can also build these things to hide. You can build decoys. You can build it into a hole in the wall or into a cave.
Carl (22:46): I mean, you know, whatever you wanna do. No one will get a lock on where it is. And then if you think about the third world, or even the second world: we're working with a team, CMI Energy, in Guatemala. They don't have a grid system to even put in a hyperscaler.
Carl (23:03): Right. So then you so what are they
Unknown Speaker (23:05): the natural solution anyway.
Carl (23:07): Yeah. So it's, there's a lot of reasons why you should build, you know, an edge scale. I'm not just pitching this from like, because I'm have a self interest in it. I'm just thinking about engineering the architecture of it and what where I'm actually headed with this ultimately is building sort of a grid 1.5 to two point o. The grid for capacity needs to be able to expand.
Carl (23:27): Now grid two point o, just to define these, grid one point o is centralized grid system. We all know that. Grid two point o would be a decentralized grid system, meaning it's totally decentralized. Power's coming from all different places, etcetera. But grid 1.5 is the connective tissue between the two.
Carl (23:41): And what I mean by that is you have batteries on the grid that can decide whether they use the excess storage for the grid operator, so they can balance the grid, or whether it's used for compute to generate revenue. And it can also act as a switch for DERs, or distributed energy resources. So it's storing and acting as a buffer between those two things. And then, ultimately, Grid 2.0 is there as it's growing. You can start building microgrids, and this is where the LLMs do come in.
Carl (24:09): Writing energy LLMs to forecast where we actually need to build power, microgrids, etcetera, for resilience, redundancy, and profit from things like compute. Through robotics, automation, all this other stuff, all these other markets that are supposed to come, we are building out that grid. Just like fiber optics did with the internet; very similar architecture.
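The Grid 1.5 "switch" Carl describes, where a battery's excess storage goes either to grid balancing or to local compute, is at bottom a price comparison per unit of stored energy. A minimal sketch, with hypothetical prices (the real decision would also weigh contracts, battery wear, and reliability obligations):

```python
# Sketch of the Grid 1.5 dispatch decision: route stored energy to
# whichever offtake pays more, grid balancing or local compute.
# Prices below are invented inputs, not market data.

def dispatch(excess_kwh, grid_price_per_kwh, compute_price_per_kwh):
    """Return (chosen offtake, revenue) for the excess stored energy."""
    if compute_price_per_kwh > grid_price_per_kwh:
        return ("compute", excess_kwh * compute_price_per_kwh)
    return ("grid", excess_kwh * grid_price_per_kwh)

# Grid buyback pays 6 cents/kWh; local GPU workloads effectively pay more:
choice, revenue = dispatch(excess_kwh=100,
                           grid_price_per_kwh=0.06,
                           compute_price_per_kwh=0.25)
```

With these inputs the battery powers compute; if balancing prices spike during a grid event, the same comparison flips the energy back to the grid operator.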
Unknown Speaker (24:29): So, to be very pedestrian about it: I've got a solar roof, I've got excess power, or maybe I don't. If I'm selling it back, I'm paying for the electric in my house, I'm selling it back to the grid, everything's cool, I'm making some money. So it's a push.
Unknown Speaker (24:45): Right? And I'm making a little bit of money. Is your solution a better solution: put the box here and make more money?
Carl (24:55): Yeah. At the end of the day, yeah. I mean, it's an alternative to selling back to the grid, ultimately. We're just a competitive market to the current grid, if you want to keep it very simple.
Unknown Speaker (25:07): And what's the, I know this is a new company and all that business, but is the CapEx to buy the box two years to pay back, three years to pay back? What's that like?
Carl (25:15): Yeah. I mean, I can share my screen again if you want. Just to give you an idea.
Unknown Speaker (25:19): Of course, at scale, Home Depot would be a different, you know, calculation. Right?
Carl (25:24): Yeah. So these are like the unit economics, for example. Yeah. So this is like basically NVIDIA chips. So when we call it home edge versus business edge, home edge is really a $50.90 card that is that goes into your what looks like a desktop computer.
Carl (25:40): And so but with Dell, you know, you can do zero money down. You can lease. You could do, you know, whatever. And then this is roughly the cost, the payback period, monthly revenue, the fiber cost, the Internet cost, the energy costs. So we we kinda price all of that in.
Carl (25:58): And then that's your return. But this is on the very low end. What we're doing here is just pricing GPUs at wholesale. What it doesn't include is the revenue as we build more product onto our cloud. Think of how Google has YouTube sitting on their cloud and makes tons of revenue from YouTube, for example.
Carl (26:19): So we would share that revenue with our energy host. So as we build more edge products, as we build more things, the energy host payback period gets faster and faster.
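The unit economics Carl walks through (hardware cost, monthly GPU revenue with fiber, internet, and energy costs netted out, plus a revenue share that shortens the payback period) reduce to simple arithmetic. A minimal sketch, with entirely made-up figures, since the episode doesn't give the actual pricing:

```python
# Hypothetical illustration of the payback math described above.
# All figures are made-up placeholders, not Elektra's actual pricing.

def payback_months(hardware_cost, monthly_gpu_revenue,
                   monthly_fiber, monthly_energy, revenue_share=0.0):
    """Months until cumulative net revenue covers the hardware cost.

    revenue_share models extra income from products layered onto the
    cloud (the YouTube-on-Google-Cloud analogy), paid to the energy host.
    """
    net_monthly = (monthly_gpu_revenue * (1.0 + revenue_share)
                   - monthly_fiber - monthly_energy)
    if net_monthly <= 0:
        raise ValueError("box never pays for itself at these rates")
    return hardware_cost / net_monthly

# Baseline: GPU wholesale pricing only.
base = payback_months(2500, 180, 60, 25)
# With a hypothetical 25% revenue share from additional edge products.
shared = payback_months(2500, 180, 60, 25, revenue_share=0.25)
print(f"baseline: {base:.1f} months, with revenue share: {shared:.1f} months")
```

The revenue-share term shows the mechanism Carl is pointing at: layering more products onto the cloud shortens the host's payback without changing the hardware cost.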
Unknown Speaker (26:27): So one of the things, as I understand it, and I could be wrong, is that the biggest problem with the LLMs and the data centers is that they need consistent power. And when there's a surge, they have to turn on the gas turbines and pay more per kilowatt hour than they otherwise would. So it's just a cost of power, and they're still only charging you $20 a month or whatever you're paying for your LLM, so their profit margins are consumed simply because they have to pay more to deliver consistent power. Is this a solution for that as well?
Carl (27:09): Yes. I mean, well, for hyperscalers it is quite a bit different. I mean, they have to have massive backup reserves. Their draw also changes so fast, so quick. And with hyperscalers, they also have to overbuild for capacity.
Carl (27:26): So they can't actually utilize all the GPUs they install. In general, I think they get 65 to 70% utilization, sometimes less, because they don't wanna be the reason the Internet goes out. I mean, we haven't had an Internet outage in a long time, unless it's like a disruption in the cables or something. But never because a hyperscale data center is fully utilized, 100% utilized. Similar to the grid.
Carl (27:49): There's 40% margin on the grid. And, you know, a hyperscaler will swap energy sources to maintain consistent power and also not to fry the circuitry of the GPUs themselves. They use different battery layers to smooth out the swings that would fry the switches and all the internal server equipment. So they actually have a whole bunch of buffers and alternative energy sources to protect against that. But for us, it's slightly different.
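The buffering Carl describes, batteries absorbing fast swings in GPU draw so the upstream source sees only a slow ramp, can be sketched as a toy model. The ramp limit and load profile below are illustrative numbers, not real hardware specs:

```python
# Toy model of the battery-buffer idea: GPU power draw swings fast; the
# battery absorbs or injects the difference so the grid-facing draw
# ramps slowly. All numbers here are illustrative assumptions.

def buffered_draw(load_kw, ramp_limit_kw_per_step):
    """Grid draw follows the load but never ramps faster than the limit;
    the battery covers the instantaneous difference."""
    grid, battery_delta = [], []
    current = load_kw[0]
    for load in load_kw:
        step = max(-ramp_limit_kw_per_step,
                   min(ramp_limit_kw_per_step, load - current))
        current += step
        grid.append(current)
        battery_delta.append(load - current)  # positive = battery discharging
    return grid, battery_delta

# A sudden 400 kW spike in GPU load:
load = [100, 100, 500, 500, 500, 100]
grid, batt = buffered_draw(load, ramp_limit_kw_per_step=100)
print(grid)  # grid draw steps up 100 kW at a time instead of jumping
```

The battery column is just the gap between actual load and the rate-limited grid draw, which is the role the "battery layers" play in the description above.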
Carl (28:24): I mean, we have what we call an orchestration layer, and you saw a little bit of it when I showed you our cloud. So with those four different layers, if a power site goes down, in our case we just don't use the GPU at that site location. So let's say
Unknown Speaker (28:39): you just switch it out.
Carl (28:40): We just switch. Yeah. The orchestration layer sends jobs to different places. And then that's it.
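The failover behavior described here can be sketched in a few lines. The site names and the round-robin policy are illustrative assumptions, not Elektra's actual scheduler:

```python
# Minimal sketch of the failover described: if a power site goes down,
# the orchestration layer simply stops routing jobs there.

class Orchestrator:
    def __init__(self, sites):
        self.sites = list(sites)   # all known GPU site locations
        self.offline = set()       # sites currently without power
        self._next = 0

    def mark_down(self, site):
        self.offline.add(site)

    def mark_up(self, site):
        self.offline.discard(site)

    def dispatch(self, job):
        """Round-robin a job to the next healthy site."""
        healthy = [s for s in self.sites if s not in self.offline]
        if not healthy:
            raise RuntimeError("no healthy sites available")
        site = healthy[self._next % len(healthy)]
        self._next += 1
        return site

orch = Orchestrator(["kearny-nj", "austin-tx", "denver-co"])
orch.mark_down("austin-tx")          # a power site goes down
print(orch.dispatch("train-job-1"))  # routed to a healthy site instead
```

A real scheduler would weigh queue depth, energy price, and latency, but the core move is the same: the downed site simply drops out of the healthy list.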
Unknown Speaker (28:45): So where are you now? You're selling the boxes, you're selling to companies, you know?
Carl (28:51): Yeah. So we just built, actually, we're launching our first mothership, what we call a mothership, in New Jersey. Probably you've seen
Unknown Speaker (29:00): something about that. Yeah.
Carl (29:02): Yeah. So I've posted some pictures, and we'll probably have a party and stuff. So yeah. We sell these to what we call energy hosts in the market. And then we build motherships in different places.
Carl (29:14): So that one's in Kearny, New Jersey. It's like three miles from New York. So eventually we'll have a bunch of satellites around here, and that will serve the New York metropolitan area, Newark, so we can coordinate Waymos, we can do all that sort of good stuff. But then we're building these motherships in Texas, California, and Colorado to network with the satellites that we have there.
Carl (29:35): And the main reason we're doing it is, well, we get 100% of revenue, which is, you know, also good to stabilize our business. Primarily, it's to keep
Unknown Speaker (29:44): The company store. Right?
Carl (29:46): But it's also to keep 100% uptime for our developers. In an SLA agreement with them, I have to be able to guarantee three nines or five nines, and in order to do that, I have to have my own infrastructure; I wanna be able to guarantee their uptime. Because satellites, you know, especially homes... Homes are what we call community cloud.
Carl (30:08): And then for something more like a Home Depot, we can actually build to an actual data center spec, which is SOC 2 Type 2 compliant. So it's more secure for the developers, and they're less likely to go down because they have more infrastructure, etcetera. But at a home, you know, the cleaning lady could just unplug the computer.
Unknown Speaker (30:27): Right. Right.
Carl (30:27): Right. So yeah, for resilience purposes, we're building our own infrastructure. But, you know, it's an insane IRR.
Carl (30:35): I mean, even with just Kearny, we're revenue positive as a business from day one. And each one of our motherships generates about $10 to 15 million of revenue as we build it. So our next job is just build, build, build. I'm raising another $20 million to do these four different site locations, and then I'm setting up electric infrastructure, which will basically build out 50 of these. I'll raise about $200 million in equity, $600 million in debt.
Carl (31:04): And then, fingers crossed, the Department of Energy will give us about a billion dollars to then really massively scale things out. So we're starting to talk about bigger numbers here, almost $2 billion. And then you realize the whole industry is spending $750 billion this year, and we're like, okay. It's a crazy market. But here's why we're competitive.
Carl (31:24): It takes us two to three months to build a site. It takes hyperscalers right now, because they lack power, twenty-four months. So our speed to market is really fast. We actually even have cloud companies coming to us saying, can you build out at your power site location? So, you know, that's great.
Carl (31:39): I mean, I think business will be great for us over the next two, three years.
Unknown Speaker (31:42): So what's the moat? Is it your patent?
Carl (31:45): Yeah. Yeah. I mean, and I have some more that are coming, but it's basically the core patent: distributed energy powering GPUs, TPUs, DPUs, 16 other processing cards. No one was thinking about this. I wrote those in 2023 and got granted in 2024 because we knew grid constraints were coming.
Carl (32:03): We didn't know that AI was gonna be crazy like this. But we knew that we needed to lock it down. So, yeah, that's a moat. And, you know, you see some other companies now trying to get in on edge compute.
Carl (32:16): Even Nvidia, like, they've gone with a company like Span, or they're working with Comcast, to think about how to get chips out to the edge and utilize the grid a little differently. But they're still optimizing for Grid 1.0. They have not made the leap to distributed energy. And to me, that's insane, because how do you expand our grid? I've looked at how the government would do it, like a government spend on trying to build more wires, and it's impossible.
Carl (32:43): Even the lead times on, you know, the transformers that are required for substations, whatever, the lead times are six to eight years. So you couldn't build faster that way.
Unknown Speaker (32:54): Yeah. Was a there was a guy that did a documentary called grid down that we had on the program.
Unknown Speaker (32:59): Yeah. David Tice.
Unknown Speaker (33:01): Yeah. David Tice. Yeah. That guy. And it's scary.
Unknown Speaker (33:06): I mean, we all knew it was scary, but
Carl (33:08): Yeah, you won't sleep if you spend any time with Dave Tice.
Unknown Speaker (33:11): No, no.
Unknown Speaker (33:12): I went by the way, I went to the premiere of that movie. I met Dennis Quaid. I went
Unknown Speaker (33:16): Dennis Quaid. Yeah.
Carl (33:18): Yeah. Skinnier than I thought he'd be. But, yeah, I mean, the threats to the grid are material. Those are real, but I also think about it in terms of how Grid 1.0 would fail just on its own, without an attack, without these outside black swan events, whatever. When does the marginal cost of production and capacity burn out? You know?
Carl (33:39): Where is that line? So I've been doing a bunch of tests trying to figure that part out. Because, like, we're entrepreneurs. We wanna identify problems, but we also wanna have an idea of how to solve them.
Unknown Speaker (33:52): Totally. Yeah.
Carl (33:53): And that's where you can create the data center as a flexible load to the grid. We're actually in an RFI with Con Edison right now in Zone J. So batteries are the key, I think, to stabilizing Grid 1.0 and creating storage, you know, whatever. But a lot of the returns are like 8 to 10%. So infrastructure guys are like, you know, it's
Unknown Speaker (34:13): Not very interesting. Yeah.
Carl (34:14): Yeah. But when you pair it with an edge-scale data center, returns get up to, like, 25, 30%. So that pencils. So what we're trying to do, at least in the state of New York, is attach some compute revenue to these batteries, and then we can put in things like flow batteries. Like, we're working on a project with Teraflow on vanadium flow batteries, long-duration, ten-to-twelve-hour batteries. So, you know, environmentally friendly.
Carl (34:42): It's more of a supercapacitor. Those are the types of innovations that I think need to happen in this sort of Grid 1.5 period. How do we expand the capacity of the grid and start working the DERs into the model? So yeah. And we're actually building some software called Electric Connect, which is a platform that allows anyone with energy assets to connect and use that energy for an off-take of any type. So compute is the first step.
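The pairing argument from a moment earlier (batteries alone returning roughly 8 to 10% on capex, versus 25 to 30% once compute revenue is attached to the same site) is back-of-envelope arithmetic. A sketch with hypothetical capex and revenue figures:

```python
# Back-of-envelope version of the pairing argument: a battery project
# alone returns ~8-10% on capex; attaching an edge data center's
# compute revenue lifts the blended return. Figures are hypothetical.

def simple_return(capex, annual_net_revenue):
    return annual_net_revenue / capex

battery_capex, battery_rev = 1_000_000, 90_000   # ~9% standalone
compute_capex, compute_rev = 400_000, 280_000    # hypothetical edge site

standalone = simple_return(battery_capex, battery_rev)
paired = simple_return(battery_capex + compute_capex,
                       battery_rev + compute_rev)
print(f"battery alone: {standalone:.0%}, paired with compute: {paired:.0%}")
```

With these placeholder numbers the blended project lands in the mid-20s percent range, which is the "that pencils" threshold Carl describes for infrastructure investors.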
Carl (35:14): But then, yeah, as you move through time, it's automation, it's manufacturing. You know, we did an experiment actually over Christmas. It was really funny. We used the power at my head of cloud's home location, about a dollar fifty of power and about fifty cents of material on a 3D printer, to build snowball launchers, and then we sold them to the kids for, like, $20.
Carl (35:35): So that was a pretty decent margin, but it got us thinking: for supply chain and 3D printing, anyone with energy capacity and a manufacturing setup could actually produce these materials and sell them on location, for plumbers, for fittings, for all these different use cases, and you get rid of all that supply chain. You know? So once you disrupt the energy market and move compute to where energy production is, it's really interesting what you can start doing. So
Unknown Speaker (36:05): Yeah. So what's the play for somebody that, first, has an interest in deploying capital and, second, sees what's happening? I haven't seen the details, but there's pushback on all these big data centers. Right?
Unknown Speaker (36:20): Tremendous. Whatever
Unknown Speaker (36:21): you know, the regulatory pushback, consumer pushback, whatever it is, right? Political pushback. Aside from, like, you don't need anybody to invest in your mothership because you don't want them to, I imagine, because it's pure profit for you. But what about investing in the company as an expression of that?
Carl (36:40): Yeah. Yeah. So, I mean, so far the capital we've raised has been from the industry. Like EG4 is a battery maker. We've had family offices, but a lot of them come from like the solar business or from, you know?
Carl (36:51): So we've been really authentic too. We've actually turned away some VC capital from firms that are not aligned with this mission, because we would have been a division of them. It was a Fortune 500 oil and gas company, by the way. And it's sort of like, no. Not that I even mind oil and gas.
Carl (37:06): It's just that I don't wanna lose control of where we're trying to go. So we are raising money from equity owners for Elektra. We're just launching a $20 million note here in the near term for anyone who's interested. But then it turns very much into an infrastructure play with the mothership.
Carl (37:24): So it'd be infra investors, asset-backed lender types, you know, that sort of investor, I think, would get more interested. And then with those revenues, we'll probably go for an A round sometime in 2027, and then we could do a really competitive priced round with VCs. And I am gonna target corporate VCs, like Comcast Ventures, because they have distribution through telcos, and Constellation Ventures, different distribution channels. So that's the idea. I like corporate venture companies because they're a little more roll-up-your-sleeves, get into the business. They wanna not just profit from you; they wanna actually distribute for you and figure out a reason
Unknown Speaker (38:01): for business develop you.
Carl (38:04): And they give you new ideas on product. So, yeah, we're gonna run that $20 million note starting now and hopefully finish it before the end of summer. You know,
Unknown Speaker (38:15): So what could go wrong? If Jensen comes out with a new chip that doesn't consume so much power? Look, I don't understand all that stuff, so I don't even know if that's possible, or whether it would matter. What else could go wrong for you, for your company?
Carl (38:36): All sorts of things, I guess. Probably execution risk; I think that's probably our biggest risk. But I would say the primary risk, honestly, is that when you're building edge scale, the only other people who've built edge scale are a company called Akamai. Maybe you guys have heard of Akamai.
Unknown Speaker (38:55): Remember Akamai? Yeah.
Unknown Speaker (38:56): They were
Unknown Speaker (38:56): out for a long time. Yep.
Carl (38:57): Yeah. So they have about 4,000 site locations already, but they're not powered by distributed energy, and some are colocated with hyperscale data centers. But, you know, they have to run a distributed network. Now for us, okay, like I said, AWS has 130,000 site locations.
Carl (39:15): With our deal with Sunrun, we could get to a million homes. That's a lot of places, you know. And so the initial sale, great, fine. But what happens in three years if things break down?
Carl (39:25): You know? Am I gonna get a million phone calls from, you know, a million unhappy customers? And then what are the costs of replacing those parts and components, etcetera? And you don't wanna call Geek Squad every time for this. So
Unknown Speaker (39:36): Yeah. You know. Yeah.
Carl (39:37): So we've worked out warranties with our OEM vendors, though, like Dell and, you know, Penguin and Supermicro, to cover the hardware side of the business. And then we can do patches. Like, we're in charge of the software side of the business, so we can upgrade and do all those things. Price of chips, we've thought about. Market conditions degrade those chip prices; some rise and some fall.
Carl (40:02): With the industrial site locations, we actually tell them to build small to begin with, so that they can build into the capacity and adapt to the layers of chips that are coming out. The next chips coming out that Jensen announced are the Vera Rubins, and those should be out in the next six months. So there will be a surge in demand for those, and maybe the prices fall a little bit. But there's still a huge question mark on the usefulness of these chips, because what we've seen in the power crunch so far is that H100s, which are relatively old, even A100s, started going up again because people need them.
Carl (40:37): They're scarce. And also, people have built software environments around those chips, so they continually need to use them. So actually, Jensen's running into this thing where we thought every cycle all the older chips would just go down and the new chip would go up. And we're like, oh my god. But he's also killing his own business margins by having those price drops.
Carl (40:55): So actually the market's responding in a much different way than we all kind of thought. So forecasting chip costs is interesting. Actually, I think Larry Fink just announced that he's buying so much compute that he needs to figure out a futures market to hedge his compute prices. There is a company I met called the Compute Exchange, two guys at Stanford University. They made the Compute Exchange, but it's not quite a futures market yet.
Carl (41:20): It's an auction market. And it feels like it's mostly for big hedgers, like a hyperscaler. But what we want ultimately, I guess what the market would want, would be the ability to buy a futures contract or a forward on the 5090 that I bought from Dell. But maybe we get there.
Unknown Speaker (41:37): I don't know.
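The hedge being discussed, locking in a forward price for GPU-hours so a host's revenue doesn't collapse when spot prices fall, can be illustrated with a toy payoff calculation. Prices and volumes are hypothetical, since no such retail forward market exists yet:

```python
# Sketch of forward hedging for compute revenue. All prices and
# quantities are made-up illustrations, not real market data.

def hedged_revenue(hours, forward_price, spot_price, hedge_fraction):
    """Revenue with a fraction of capacity sold forward, rest at spot."""
    hedged = hours * hedge_fraction * forward_price
    unhedged = hours * (1.0 - hedge_fraction) * spot_price
    return hedged + unhedged

hours = 500  # GPU-hours sold in a month
# Suppose spot drops from $0.60 to $0.35/hour after the host buys in.
unhedged_rev = hedged_revenue(hours, 0.60, 0.35, hedge_fraction=0.0)
half_hedged = hedged_revenue(hours, 0.60, 0.35, hedge_fraction=0.5)
print(unhedged_rev, half_hedged)
```

Selling half the capacity forward at the old price cushions the revenue drop, which is exactly what a futures or forward contract on that 5090's output would buy the host.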
Carl (41:38): But I swear I think we are on the right track. Figuring out this distributed power play is inevitable anyway, because I think the grid is at its peak, and because no one is working on this, it really carves out a lane for our company to work towards. And if all we do is get the ball rolling, like Friendster did for Facebook, then I think we've sort of won, for society. You know, part of me is like, I'm 45. I'm not that old, but you start wanting to build companies that have meaning.
Carl (42:12): It's not just about making the money. It's about solving a real world problem for
Unknown Speaker (42:17): This is a real world problem. That's for sure.
Carl (42:19): And solving it for people at a decent level. Like, you know, all these homeowners. Like, we're gonna help out down in Tennessee, in a low-income county. There's a school there with solar on the roof. It's a Republican district, by the way.
Carl (42:31): It's like, you know, whatever. We're working with a history teacher to utilize that solar and put in an edge-scale machine so the next generation of kids can start really understanding energy and start building things from it. And that's an important point. I think our generation has lost our understanding of energy.
Carl (42:49): Like, if I asked you how many kilowatt hours of electricity you used last month, you couldn't tell me. But a hundred years ago, we would know that we'd need, like, two cords of wood to get through the next three months. We had an attachment to our energy. But since the grid came out, we've kind of disconnected. And so, yeah, the only way we know how energy works is by charging our phone and being like, ah, it's gonna run out in eight hours.
Carl (43:10): You know, whatever. That's the only attachment we have now. But by bringing in this, all of a sudden people will start realizing, oh, energy can be an asset. Wait, I can make money off my solar roof. How else can I... oh, I can make it through compute, but what else can I do?
Carl (43:24): And so training the next generation of kids all throughout the country, low income, you know, wherever it is, there's hope that you could actually generate energy as an asset class. And there's all this unlocked power that comes from it, a lot of income that can come from it. So anyways.
Unknown Speaker (43:39): That might be the universal income play, right? Which there's all kinds of issues around. But put up a couple of solar panels if you've got a big field, you know, in Utah somewhere, and, aside from how ugly that is, put a bunch of your boxes in it instead of having hay for the horses.
Carl (44:06): Yeah. Well, I get annoyed too. I mean, I worked for the longest time, actually, even with a wind company I invested in, trying to make this stuff look pleasant.
Unknown Speaker (44:15): Yeah. It's hard.
Carl (44:17): But anyways, our innovation is all distributed energy. It doesn't have to just be renewables. So, in the future... like, I've met with the Trump administration. I went to one of their events down in DC. Trump created something called the National Energy Dominance Council.
Carl (44:32): I don't know if you know this team, but it consists of, like, Peter Lake, who was appointed by Governor Abbott to repair the ERCOT grid. So there's guys like him, Jared Egan, on it. And, you know, the whole event was about SMRs. So Trump is cutting a whole bunch of red tape to see this happen fast. Rolls-Royce is in it.
Carl (44:48): Westinghouse. There's kind of a race. Although I talked to, like, KKR, and they're like, we haven't invested in anything yet. They usually invest five years ahead of when they think something will happen. So it might be a ways out.
Carl (44:59): We're not trying to discriminate between kinds of fuel, like fuel cells or hydro or, you know, natural gas potentially. Our bigger concern is the grid, really making sure we crack that problem and bring everyone into the energy economy. Because honestly, in my own opinion, our energy policy is kind of stupid, swinging back and forth like this. Like, I've learned in Norway, the oil and gas guys can sit right next to the renewable guys and it doesn't really matter. It's all just electrons or photons. No one really cares.
Carl (45:32): It's just powering our society. And so having a better energy policy where it's not political would allow us to actually grow our energy economy so much larger. And then if you're gonna compete against the Chinese, I think we gotta stop the horse race. We gotta just say, look. Alright.
Carl (45:48): What's cheapest? Let people build what they need to build. Let's subsidize it all if we have to, if that's what it takes to compete. And I think that's what's happening.
Unknown Speaker (45:58): Isn't it true that most of the transformers are built in China?
Carl (46:03): Yeah. A lot of them. Also, HEICO, Siemens and, you know yeah. It's true. Look.
Carl (46:11): We outsourced. And, I mean, I don't know if we were naive or whatever, but in the nineties we totally thought this neoliberal trade policy was a good idea. It was a good idea if our friends across the seas respected it, but they stole a bunch of IP. They abused the trust. It is their way of life.
Unknown Speaker (46:35): Yeah.
Unknown Speaker (46:35): Yeah. It's, it's, it's a completely different.
Carl (46:38): And, by the way, I totally respect it. That's being a capitalist to a tee. Right? And we shouldn't have been so naive to think, oh yeah, our partners are just gonna honor it. But, you know, that's on us. But now that we know it, now that the cat's out of the bag, the question is how do we build the next generation where it's not so dependent on that supply chain?
Carl (46:56): And I think it starts with an energy-first policy. How do we innovate the crap out of energy and energy systems and new grid systems and
Unknown Speaker (47:04): Everything comes from there.
Carl (47:05): You know what? We gotta leverage American innovation. The one thing people say AI is diminishing is human capital, whatever. No. Actually, in talking to Jensen, he was saying we need butts in seats.
Carl (47:18): Meaning, the computer needs someone to prompt it. You need a lot of smart people asking the right questions
Unknown Speaker (47:24): in order
Carl (47:24): to get results.
Unknown Speaker (47:25): Then making sure there's quality control on the result, which can only be done by humans, as far as I can tell.
Unknown Speaker (47:34): Oh, totally.
Unknown Speaker (47:35): So we're involved in a six-part series, all about AI, that Jensen was interviewed for, for hours, and participated in. We're rolling out the episodes. The UN has an AI division, and we're doing something in Switzerland in a couple months.
Unknown Speaker (47:56): Hey, cool.
Unknown Speaker (47:57): And yeah, Jensen really opened up his door and let us interview him. It's a friend of mine who made it; he's a filmmaker. That's his business. And so he's got this six-part series talking about, rather than it being doom and gloom, all the good stuff that could come from AI. Right.
Unknown Speaker (48:19): So we'll see how that rolls out in a couple months. I've seen the clips and some of the previews. It's really interesting. It's well done. He's a really good filmmaker.
Unknown Speaker (48:29): He's an awesome speaker too.
Unknown Speaker (48:31): Yeah, he's a great guy. So today they announced that, I think Apple just threw in with Intel to build a new chip.
Unknown Speaker (48:40): Okay.
Unknown Speaker (48:41): Something like that. So, you know, things are happening, right? It doesn't mean that it's going to be better than anybody else. It's just going to be chip capacity, I think. And then what do you think?
Unknown Speaker (48:54): Like, for the people that are listening, I'm sure many of them have solar, which is probably the most popular thing to have on your home. Right? You're probably not doing much else; you're not doing wind, necessarily. Does it make sense for them to ring you up and say, let's put a box here?
Carl (49:14): Yeah. 100%. Yeah. We're open. I mean, we'll be launching a new website.
Carl (49:20): But, you know, we're getting better at it too. We're streamlining the customer experience and, you know, trying to be a real company. We've had to do a lot of things. We had to marry an energy market, which is complicated, reduced down to its easiest parts, with an AI world reduced to its simplest parts, to make a device that's easy enough for someone to just take out of the box and install. So there's a lot of thinking that's been going on, and continues to go on, to evolve this product over time.
Carl (49:51): You will get an app that basically tells you: okay, this is how much energy you have, here's how much you can store, and here's how much you can sell it for. So it starts gamifying your energy. We have games we're building, like how you can build alongside your neighbor and how they can add more capacity at their home location. That's why we're paying them 80%, because the hope is that they actually put in more solar.
Carl (50:16): They add batteries and they build this resilience. They're actually building Grid 2.0 for us because they're making more money out of it. Yeah. And that's a lot of the idea.
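The app logic Carl describes, showing how much energy you have, what you can store, and what you can sell it for, amounts to comparing the value of each off-take. A toy version, with hypothetical rates:

```python
# Toy version of the app decision described above: given surplus solar,
# compare the value of each off-take and pick the best. All rates are
# hypothetical placeholders, not real tariffs or compute prices.

def best_offtake(surplus_kwh, grid_sellback_per_kwh,
                 compute_value_per_kwh, battery_headroom_kwh,
                 stored_value_per_kwh):
    options = {
        "sell_to_grid": surplus_kwh * grid_sellback_per_kwh,
        "run_compute": surplus_kwh * compute_value_per_kwh,
        "store": min(surplus_kwh, battery_headroom_kwh) * stored_value_per_kwh,
    }
    choice = max(options, key=options.get)  # highest-value off-take
    return choice, options

choice, values = best_offtake(
    surplus_kwh=12, grid_sellback_per_kwh=0.05,
    compute_value_per_kwh=0.22, battery_headroom_kwh=8,
    stored_value_per_kwh=0.15)
print(choice, values)
```

With these placeholder rates, running compute beats the grid sell-back by a wide margin, which is the "competitive market to the grid" framing from earlier in the conversation.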
Unknown Speaker (50:26): Do they need to do anything? Like, they've already got the solar configuration. They plug in the box. Do they need to have an electrician come out and do anything?
Carl (50:33): Nope. Nope. Yeah. Plugs into the wall. Yeah.
Carl (50:37): For homes, it's really simple. For, like, your industrial guys, we actually have to run a sizing, understand the site location, understand its fiber, understand a few other things, and then we can put in orders and competitive bids and then ship it. That's a little bit more involved, but, you know, it's not that complicated.
Unknown Speaker (50:56): Yeah. That complicated. Yeah.
Carl (50:59): Then the other thing, just to say this too, to dovetail on what you're saying about Jensen and the UN and how you can use AI: we've got one customer, an energy host down in Pennsylvania, who built 700 kilowatts of power on their chicken farm. They do not need 700 kilowatts for their chicken farm. That's way too much for the chickens.
Carl (51:20): And so we're building them a server in a container. But what he does run is Caterpillar machines on his site. What we can actually do then is use Elektra, use his machine, to automate parts of his farm. So our starting point with AI is understanding energy requirements: how much power do I need at this location? And then the AI can actually run a system.
Carl (51:48): Like, when you think about robotics, Elon's obviously building this robot to come into your house like the Jetsons. But a robot doesn't really need to look like a human. A robot could be the brain is here, and I have an arm as a lawnmower, an arm as a car. You know? That ecosystem could be a lot of different things.
Carl (52:04): So our starting point is to understand the energy requirements and start building that capacity. And then: how much AI do I need at my location? Can I build businesses? Can I run my farm better? Can I make my IoT systems work better in my bank, in my, you know, whatever?
Carl (52:20): And that's how you localize AI, instead of trying to get it from ChatGPT top-down, where these guys are making money hand over fist and it's like five guys who are gonna run the planet. Screw that. People should be able to have their own energy, their own AI, and their own data at their own location. They should be able to monetize not only their energy but their data too. And so that's been a big push for Elektra.
Carl (52:41): It's really important to us to make sure that homeowners actually have the rights. You know? These are the property rights. These are something that we should control, not give up because, oh, China's gonna come over the mountains and and kill us all. No.
Carl (52:54): I think we are an advanced civilization because, going back to Adam Smith, we came up with property rights. Those still matter. And for energy rights as property, and AI: if we create a system where people are at the center, then AI will really thrive. Because then it'll be more competitive. You'll have more entrepreneurs, and you'll feed a system instead of a top-down thing.
Carl (53:18): So that's my Thomas Jefferson for the day.
Unknown Speaker (53:19): Yeah. Yeah. No. That's absolutely right. Totally agree with you.
Unknown Speaker (53:23): And it's important to say that because it's kinda like having mineral rights. They're yours. Right? And you can monetize them if you want. So what about the idea?
Unknown Speaker (53:33): So to me to oversimplify it, what you've done is decentralized, but more to me, the real edge of edge is that you don't need to be selling it back onto the grid, you just put it into your cloud and you're done. That's it.
Unknown Speaker (53:50): Yeah. That's it. So it's just a different off taker. Yeah. Yeah.
Carl (53:54): It's crazy. I mean, we've been so trained to the grid that when I talk to people about Elektra, they get it. And then I'm just like, alright. Wait for it.
Unknown Speaker (54:03): They immediately go back to the grid. Right?
Unknown Speaker (54:05): Yeah. Yeah. Yeah. But then, like, weeks later, they'll come back and be like, wait a second. So you don't have to sell to the grid?
Carl (54:11): No. Like, it takes a while to decouple from the grid mentally. And then once you get it, your mind is open and you're like, oh my god, I could do so much more with my energy. Why was I not doing that before? Well, because a monopoly tells you that you can't.
Carl (54:24): And then that's, that's powerful, you know. And then and then there was no real competitive alternative, I suppose, for the longest time. I mean, you know, we built around this grid and there was no problems with it and whatever.
Unknown Speaker (54:36): So I don't know if this is a real business model, but someone came to me and said, look, rather than building these big data centers as an alternative method to make money and also power things that we need, big power companies on location, they have excess power all the time because they have to deliver 95% all the time. Right?
Karl (55:00): Right.
Unknown Speaker (55:01): And so they're sitting on excess power regularly that just gets dissipated or whatever the proper term is. Right? What about putting one of your edges right next to there, taking their excess power when they don't need it, and then pushing it up to your cloud?
Karl (55:17): We could do that. Yeah, we do it. There are so many businesses like that, that have, you know, overbuilt the power requirements for something like that. We know some plastics manufacturers, a similar case.
Karl (55:31): Yeah. There's a lot of stranded energy in a lot of locations, and it's perfect to build towards that. And then our energy LLM can also tell it to build additional power sources if they have fiber. We can start working with batteries at the site location. It'll start forecasting what it needs to do.
Karl (55:47): It could build a microgrid. It can, like, share it with their neighbors, you know, that sort of thing. Yeah. So ultimately, building a decentralized architecture is kind of interesting because we don't know exactly how it's gonna go. So I've actually started working on a video game to help us game out how people would actually behave.
Karl (56:04): Almost like a digital twin before we build out the decentralized grid.
Unknown Speaker (56:08): How cool is that? Yeah. That would make sense.
Unknown Speaker (56:11): We'll come out of that soon. But I I I Scenario planning. What's that?
Unknown Speaker (56:15): Scenario planning. Yeah. Exactly. I like it. Well, Carl, thanks for doing this really interesting and bleeding edge stuff.
Karl (56:25): No, thank you for having me. I mean, thank you for putting me on the platform. I'm just trying to get, you know, sort of the word out of how this works. And, you know, Lektra, we're a platform. We're open to doing business.
Karl (56:35): We're trying to push the market in the right way. You know, I think if we can get people to start thinking about all these different ways they can use their energy, then we've done our job, you know. So, yeah, I appreciate it.
Unknown Speaker (56:47): This, I have one being the capitalist that I am question. So if I went to all my neighbors and said, Sunhomes or whoever you have to deal with is going to put a solar roof, or I go to Tesla and they say, I'm going to finance putting it on all your roofs. I'm going to put an edge compute in the middle of the neighborhood and you don't have to worry about financing it. We'll share the revenue with you and you guys will be powered independent plus make money. Is that something that's doable?
Karl (57:19): Oh, yeah. I mean, we're partners with solar distributors all over the country, so it's not just theoretical. Yeah. A lot of solar companies have come out of the woodwork. The solar...
Unknown Speaker (57:28): Does the math work?
Karl (57:30): Yeah. What we do is we just cut them in for a slice, you know, like a distributor. We give them some of the revenue. Again, it's an eighty-twenty split, so we give them some of the ongoing revenue. And what I'm trying to do with solar distributors is disrupt their model too, because of this whole 30% upfront commission. Like, you know, solar owners get screwed.
Karl (57:48): I always thought it was gonna be a bunch of hippies being like, hey, man, you know, go green. It's not. It's more like a mortgage broker. A twenty-year crappy lease to get $50 off your bill. The sales guy takes 30% upfront, and then your wife is yelling at you because the panels look stupid on top of your house.
Karl (58:07): You know? You lose on all levels. So I'd like to change that around and actually incentivize the sales guy to not charge as much upfront, to bring the upfront costs way down, because if they're gonna get a trailing commission on the revenues, that's a totally different model. And then they can sell that homeowner more product. They can say, oh, actually, let's build up the battery a little bit more.
Karl (58:27): We'll get more juice so you can make more money. And they can be more of an energy adviser, almost like a financial adviser, but an energy adviser to homeowners, helping them understand what they can do over time. So, yeah, you could call them up. A lot of solar distributors have started hearing about us, but we're really gonna start with our bigger marketing push. We've been, not in stealth mode, but, you know, we got the patents, we were slowly building this over time, and now we have to answer the timing of the market.
Karl (58:55): Now we're just like, let's get the money and let's go. Because this is the time.
Unknown Speaker (59:00): Totally makes sense. Yeah. All right. Well, thanks for entertaining that question. And again, thanks for doing this, Carl.
Unknown Speaker (59:06): Really appreciate it.
Karl (59:07): Yeah. Just reach out to me anytime. If any of your family offices have questions, you know, and just want to understand it better, I'm always open to a call or an instructional session, you know, whatever. I just do this all day long and I love it.
Unknown Speaker (59:18): Yeah. Appreciate it. Cool. Okay. Thanks very much.
Unknown Speaker (59:22): Thanks everybody for being here.
Unknown Speaker (59:23): Thanks.

CEO
Karl Andersen is the Founder and CEO of Lektra, a pioneering energy and cloud technology company building a new class of infrastructure known as EdgeScale data centers—distributed, locally powered micro data centers that connect energy directly to AI compute.
His work sits at the intersection of energy, artificial intelligence, and infrastructure. Karl is focused on solving one of the defining challenges of the AI era: the global shortage of power for compute. Through Lektra, he is turning distributed energy—solar, batteries, and behind-the-meter assets—into high-value cloud infrastructure, enabling homes and businesses to participate directly in the AI economy. His model returns up to 80% of compute revenue back to energy owners, fundamentally shifting value away from centralized utilities and hyperscale data centers.
Prior to founding Lektra, Karl built his career across private equity, reinsurance, and financial services, with a focus on identifying large macro shifts and translating them into scalable investment platforms. He is also the Founder and CEO of Greenlight Asset Management, where he has led investments in climate and sustainability technologies across Europe.
Karl is widely recognized for advancing the concept of "Grid 2.0": a decentralized, resilient energy system designed to support the exponential growth of AI while reducing strain on traditional grid infrastructure. His work challenges the conventional model of centralized data centers by proposing a more distributed, locally owned alternative that improves energy resilience.
