AI Moves Off the Cloud, Google Breaks the Internet, Google-Wiz Deal Under Fire
Hello. Welcome to another episode of Cloud Unplugged. We have the usual stories, obviously quite a lot around AI. But to give an overview: we have some investment by Qualcomm, the chip manufacturer, in a company called Context, basically trying to get chip-ready AI models and agents running on your machines without needing the cloud. We have the Google Cloud outage on June 12 that affected Shopify and Cloudflare and, I'm sure, many others.
Speaker 1:We have the DOJ scrutinizing the acquisition of Wiz by Google, which is a $32 billion acquisition, I think, around March-ish time. And we also have the Y Combinator event, the AI Startup School, where Andrej Karpathy and others gave talks around how to think about AI and the new software paradigm, which they're calling Software 3.0. So how are you, Louis? How have you been? We had a little gap, didn't we?
Speaker 1:Good.
Speaker 2:Yeah. Yeah. We had a gap. You were on holiday?
Speaker 1:I was on holiday. Yeah.
Speaker 2:Yeah. I'm good, though. Yeah. I didn't go on holiday, but did go to the beach. And did you miss me? I'm in the sea.
Speaker 1:Yeah. We will edit that. So, okay. So it's not obviously. Yeah.
Speaker 1:Yeah. We'll we'll make it more convincing.
Speaker 2:Yeah. How was the south of France? Was it the south of France? Where did you go?
Speaker 1:It was west central France, actually. And I drove, as I mentioned.
Speaker 1:And, obviously, it was an electric car. And, I mean, we talked about this before, but, like, the amount of charging. Is it an issue for me? The charging I don't mind, but the fast charging points, the Tesla ones, were off the motorways. So you just lose quite a lot of time. I'd probably never do it again; just too much stopping needed.
Speaker 1:But, yeah, it was good. No one was really there as it was a very, vacant place. So you drive to, you know, you do your research. It'd be like some beautiful place. You'd be like, oh my God, we have to go there.
Speaker 1:Look at it. It's stunning. It's like a big fort and beautiful river and trees and plants and, you know, and then you kinda get there, and there won't be a single soul, and everywhere's closed. And then there'll be, like, one tiny restaurant open outside that every like, the the 10 people that are there are in, and that'll be
Speaker 2:a celebration or a reason?
Speaker 1:Just a holiday. I've always wanted to... Party. Yeah. I love France, and I want to buy in France at some point. And I did put an offer in on a place in France many years ago, and got gazumped, just before I set up Europe.
Speaker 1:So it's always been a big thing for me to do. But, yeah, I go there often. I do really enjoy it. I quite like the culture. I quite like the protectionism of their culture.
Speaker 1:You know, you don't have, like, eight million Pret a Mangers and Eats and Subways and McDonald's proliferated across all the streets so it's all homogenized. They don't really allow for that so much. They're a bit more controlled and into their food, and into their, you know, vegetation as well. They all understand about plants and plant names and things like that.
Speaker 1:Just have more awareness. So, anyway, I'm kind of a big fan of, like, yeah, the aspect of it. So, yeah, I love it. I do love France. But just I don't know what happened to everybody and where they all were, but no one was in France.
Speaker 1:So it appears or wherever I was, I think we were the only people.
Speaker 2:I let them know on social media. Did you?
Speaker 1:I bloody knew it. I knew it. I could see them zooming away as I was arriving, bags being thrown into cars, everyone quickly going inside, windows shutting. Yeah. Yeah.
Speaker 1:So that does start to, All the shutters were just Yeah. Oh, John's coming. Exactly. People running.
Speaker 2:But do you feel revitalized and...
Speaker 1:The UK? Not with the big drive. And then, as I was saying, I then went to Brighton to see some friends that I hadn't actually seen. I'm really bad, but I hadn't seen them for, like, nine years. And we'd hung out all the time for, like, ten years.
Speaker 1:And then just somehow, I drifted, probably because of the company and things. Not a good thing. And another friend of mine set it all up, bumped into them, like, oh my god, we just need to... So actually, that was really good. But after a bazillion-hour drive in France, the next morning I had to get up and go to Brighton.
Speaker 1:That was beautiful as well. It was great. But I think I just ran myself down a bit. So, yeah, no, I don't feel so refreshed, but I really enjoyed the trip.
Speaker 1:What'd you get up to?
Speaker 2:On Saturday, it just felt so oppressive, the temperature in London. So we took advantage of doing not a whole lot, which was very nice. Very good. And then Sunday, we drove down to Deal, a seaside town, and swam in the sea. Well, I swam in the sea.
Speaker 2:No one else did, but it was hot. It was nice. Got a little bit sunburned, but that's just on me. Yeah. No.
Speaker 2:It was very pleasant. Very nice drive. Very, chilled out with friends. It was good.
Speaker 1:Did you do a big swim? Where did you go?
Speaker 2:I didn't do a big swim. More just a light, refreshing...
Speaker 1:weren't still in the channel. I'll see you in Canada.
Speaker 2:I was going to visit you in France, but then I thought, yeah, it's me back already. I, you know, I thought I saw you the channel this time, just this time.
Speaker 1:Yeah, I did. I did. I thought I could see you, in the vast emptiness of France. I could see a head in the Channel, the English Channel. Cool.
Speaker 1:Anyway, getting onto the news. What do you think about this Qualcomm-backed startup called Context, which was started by people from Hugging Face, I believe, and FAIR, which I don't know so well, but obviously Hugging Face hosts all the different models and things like that. So they've been developing these RAG agents, and that's the company, Context, that's been set up. I'll come on to what RAG is in case people don't know. But what are your thoughts on basically a chip-native way of working with AI? Good move?
Speaker 2:Yeah. I think, given the nature of AIs, which we'll come to a bit later, the fact is, you know, running a model, with its tendency to hallucinate at the edges and need very specific context and hand-holding to be in the right place, you've got to kind of rethink the operating system, or the software, to make sure the AIs can use it most effectively, but also have access to your data, as you or on your behalf. And running it locally obviously means you can constrain what data is asked for and how to ferry that context around much more efficiently. And Snapdragon processors, made by Qualcomm, fit with their strategy of having more reason to put that hardware in edge devices, laptops, tablets, etcetera.
Speaker 2:So it kind of meshes with their strategy. I think it seems like, even if you're not running the whole foundational model locally, having the management of context run by a model is seeming to be a thing that is efficient. You know, you don't want to send everything back and forth, and you don't want the user to have to manage that context window or let things go off the rails. So having a degree of intelligence at the edge does make a lot of sense.
Speaker 1:Yeah. I think they have their own model called the Grounded Language Model. And I think they then compile it to work on the NPU. So NPU-compatible binaries, essentially, that it compiles into so you can run it. But I think it was, like, a 1.8 billion parameter model or something that they've managed to run.
Speaker 1:I think they use the Phi one as well, from Microsoft. Mhmm.
Speaker 2:And managed to run a tiny, tiny foundational model, but it is
Speaker 1:a foundational model. Yeah. Exactly. But they're kind of known for retrieval augmented... what's it called? Retrieval augmented generation, isn't it?
Speaker 1:RAG. So when you're asking for something, it will go off and actually get some extra information about what it is you're asking for from whatever storage it might be using. It could be a knowledge base or a database, whatever it's going to use to fetch extra context in addition to what you're asking for, to get a more accurate answer. It then supplies that to the model, and that's the augmentation part. And then, hopefully, you get a more effective answer because it's added context on your behalf.
Speaker 1:So that's what the RAG stuff does. You probably hear RAG being mentioned in the AI world quite often, but that's basically what it means. Yeah. RAG's quite an old concept, and it's evolving rapidly into just general context management.
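To make that retrieve-then-augment-then-generate loop concrete, here is a minimal Python sketch. Everything in it is a hypothetical stand-in (the toy embed function, the in-memory document store, the call_llm placeholder); it is not anything Context or Hugging Face actually ships, it just shows the shape of a basic RAG flow: fetch the most relevant snippets, prepend them to the prompt, then ask the model.

```python
# Minimal RAG sketch: retrieve extra context, prepend it to the prompt,
# then ask the model. All components here are illustrative stand-ins.

from dataclasses import dataclass

@dataclass
class Document:
    text: str
    embedding: list[float]

def embed(text: str) -> list[float]:
    # Toy embedding: real systems use a trained embedding model.
    return [float(ord(c)) for c in text[:8]]

def cosine_similarity(a: list[float], b: list[float]) -> float:
    # Pad to equal length, then compute a simple cosine similarity.
    n = max(len(a), len(b))
    a = a + [0.0] * (n - len(a))
    b = b + [0.0] * (n - len(b))
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return dot / norm if norm else 0.0

def retrieve(query: str, store: list[Document], k: int = 2) -> list[str]:
    # Rank stored documents by similarity to the query and keep the top k.
    q = embed(query)
    ranked = sorted(store, key=lambda d: cosine_similarity(q, d.embedding), reverse=True)
    return [d.text for d in ranked[:k]]

def call_llm(prompt: str) -> str:
    # Placeholder for an on-device or cloud model call.
    return f"(model would answer here, given {len(prompt)} chars of prompt)"

def answer(query: str, store: list[Document]) -> str:
    # Augment the user's question with retrieved context before calling the model.
    context = "\n".join(retrieve(query, store))
    prompt = f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
    return call_llm(prompt)

if __name__ == "__main__":
    kb = [Document(t, embed(t)) for t in ["Q2 sales rose 12%", "The NPU runs 1.8B-parameter models"]]
    print(answer("How big a model fits on the NPU?", kb))
```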
Speaker 2:But yeah. Yeah. Yeah.
Speaker 1:Yeah. Just in case people don't know, because obviously you could think it's red, amber, green. Like, who knows? We have RAG reports. So, yeah.
Speaker 1:So, that's kind of what it does. And they're building agents. I think their vision is to help with, like, the day-to-day stuff. So a bit like, you know, the Microsoft Office suite, where you have Excel and your Word docs and presentations. I think they're thinking more along those lines: a very localized way to produce things, emails or reports or decks or...
Speaker 2:And do you agree that that seems sensible?
Speaker 1:Well, the only thing I can think... I guess, obviously, it's so early. What they will become and what they're thinking about being are going to be two very different things, as always, especially as the world moves on around them. They're obviously going to pivot, no doubt. But given Microsoft is probably going to embed loads of AI-like things into Office, and they also have email and Teams and chat things that lots of companies are using, then them being able to compile things that are native and do those things just seems like a natural thing they will do anyway, same as Google probably might as well on their devices. So I'm not really sure on the opportunity there, but, yeah.
Speaker 1:Yeah. Yeah.
Speaker 2:It is weird, because there's latency for managing local data, like voice and multimodal data, if you're conversing or typing locally. But the fact is quite a lot of software now runs in the cloud as SaaS, and you're not installing it locally. So it's in question, you know, where's the AI? The data's got to come from the cloud to local, then be parsed, and then go back up to the cloud. There's lots of movement. But they're not alone in the strategy, you know: Google's Gemma models working on devices, and, you know, the AI engine built into Cursor for autocomplete does run entirely locally, and it is designed to, you know... Yeah.
Speaker 1:I know what you mean. It's a strange thing, isn't it? You're kind of right there, because a lot of emails are already in the cloud. It's not like... I mean, you're downloading them from the cloud to local. It's not the...
Speaker 2:Absolutely. Just to render on the screen, and all the actual processing and the context is already in the cloud. So it depends on the software, you know? Yeah. Software where the paradigm is to clone a git repo, for example, have it all local, work on it locally, and then push it back when you finish.
Speaker 2:Software where you're editing movies and film and pictures. All of that generally has large datasets and is local, but yeah. And could definitely benefit from this type
Speaker 1:of... Yeah. They were talking about having, like, a swarm of agents that all do these different, very specific things. So I don't know if they're going to, like... And, obviously, it's coming from Hugging Face, which is marketplace-esque with all of these models. So whether they've got something in mind that's kind of like an agent-style marketplace of all these, like...
Speaker 2:Well, what you do is release the swarms on your machine to use up your Snapdragon processors. So you want to buy extra Snapdragons, loads of them. Just
Speaker 1:loads of Snapdragon.
Speaker 2:And then when you shut your laptop screen,
Speaker 1:they all stop. Yeah. So it's basically
Speaker 2:a return to, to that sort of way of working, which is,
Speaker 1:Yeah, I guess there's a security benefit, at least, you know. If it's not in the cloud, whatever it is you're doing, if it is quite sensitive, whatever it is you produce, you might not want that necessarily in the cloud. I just feel like those days are a bit, you know... who are those people, really?
Speaker 2:I think, you know, there's not a one-size-fits-all here. Yeah. Having local chips helps, but the reality of having to work with the Internet
Speaker 1:one way or another is probably gonna stay real as well. Talking about the old interwebs. Oh, yes. The Google Cloud outage, which happened on June 12, which I think I was away for. I was just whizzing across a motorway on June 12, bundled into the Eurotunnel and whisked onto the other side.
Speaker 1:And that wasn't down? That was all working. The baguettes were still available.
Speaker 1:The fromage was still there. So everything seemed unaffected from my purview. But, yeah, apparently it affected a lot. From what I understand, they made a change to the quota system. Basically, they made an amendment, which was essentially a misconfiguration, to the backend system for quotas in general.
Speaker 1:And, obviously, quotas being, like, API thresholds, or how many queries you can make, things like that. So they amended this quota. I think they leveraged Spanner, which is obviously a globally distributed thing, and that quota got pushed out everywhere. And some of the quotas set on specific things were at zero, and in other places they were just very low.
Speaker 1:So as people were trying to use services like service discovery or authentication, they were basically rate limited off. Basically just saying, sorry, no, you've hit the limit, back off and retry later, type thing. That just caused everything to go down. So even, like, VMs couldn't authenticate.
Speaker 1:So other services in the cloud actually started to break.
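As a rough illustration of why a bad quota value replicated everywhere is so destructive, here is a hedged Python sketch of a per-service rate check fed from a policy value. This is not Google's implementation, just the general shape: if the replicated limit comes through as zero, every request, and every retry, is rejected immediately.

```python
# Illustrative sketch only: a per-service request-rate check where the limit
# comes from a replicated policy store. If a misconfigured policy replicates
# a limit of zero, every call (and every retry) is rejected straight away.

import time

class QuotaChecker:
    def __init__(self, limit_per_minute: int):
        self.limit = limit_per_minute
        self.window_start = time.monotonic()
        self.count = 0

    def allow(self) -> bool:
        now = time.monotonic()
        if now - self.window_start >= 60:
            # Start a fresh one-minute window.
            self.window_start, self.count = now, 0
        if self.count >= self.limit:  # limit == 0 means nothing ever passes
            return False
        self.count += 1
        return True

# A quota of zero pushed out globally: auth and service-discovery calls all fail.
auth_quota = QuotaChecker(limit_per_minute=0)
for attempt in range(3):
    if auth_quota.allow():
        print("authenticated")
    else:
        print(f"429 rate limited, attempt {attempt + 1}: back off and retry")
```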
Speaker 2:But from a high level, I think the impact on users was that SaaS services built on Google Cloud, and therefore dependent on these services, were affected. There were some big players, I know. Was it Shopify, was it?
Speaker 2:Yeah. Shopify. They were down. Cloudflare isn't a user service, but OAuth is a technical standard. Technical stuff.
Speaker 2:Yeah. Google's authentication service in the cloud was down, which is... It affected,
Speaker 1:it affected Drive. I think it affected Gmail. I think it affected Google's own services as well. It wasn't just, that's what I'm saying. So it wasn't just...
Speaker 2:It's hard to tell as a user with Google services when it's because of an outage or because of their great user interface design, because sometimes that just goes snap.
Speaker 1:There was a little dig there.
Speaker 2:It's a little Yeah. It's like, sorry.
Speaker 1:Yeah. No, no, I think it was quite bad. I know that Cloudflare, that was obviously a big outage, because they depend on it. They're not hosted in Google, but they integrate with some services, I think, for telemetry and for authentication. Cloudflare is very...
Speaker 2:It's not really a user-facing service, but just to explain to the listeners: it provides, like, the front door to many websites, and a distributed network for content and security across the Internet. So many services are behind Cloudflare's services. And if Cloudflare is then dependent on Google bits that go down, that can affect a hell of a lot of services.
Speaker 1:Yeah, exactly. Yeah. So that was pretty epic. But you probably wondered, Louis, well, is this going to happen again? Does anyone care?
Speaker 1:Are we all wondering that? So, it
Speaker 2:does highlight the interconnected dependency risks: higher stakes when fewer and fewer companies control more and more of the Internet and there's less competition, which we'll come on to in another story. But, you know, it does highlight that building massive solutions at scale is very hard. And, you know, less competition in cloud is a risk. So I don't know. Personally, I feel at some point the Internet should get back to, you know, IEEE-sort-of RFCs, requests for comments, and standards bodies for the whole Internet.
Speaker 2:You know, the Internet is built on protocols. It allows all machines from any manufacturer, from any geopolitical background, to all talk to each other. They all just work. And we need that for compute. I think, you know, we shouldn't just have big cloud bodies.
Speaker 2:We should have distributed peer-to-peer compute, but that's my little pipe dream and idealism sort of injected into the world. And there are standards like Wasm that, you know, promote a holistic standard for compute, but whether that actually comes in, it's a
Speaker 1:different point. Like, you know, I don't know why I find it quite amusing, but whenever people just say, you know, the Internet, the Internet, the Internet, it kind of reminds me of, like, my gran or someone, where, like, everything's the Internet. They'll be on their phone, but, oh, the Internet's down, or whatever.
Speaker 2:The internet is down and it's in the cloud.
Speaker 1:Yeah. So saying things that way, like "a cloud", is a little bit triggering, and you're like, yeah, it's the Internet.
Speaker 2:Yeah. That's where the internet is up there.
Speaker 1:Well, just to give you some reassurance, they are doing circuit breakers. So they've said, basically, essentially, if things do fail, they fail positively. I.e., if there is some rate limit failure for authentication, it will just remain authenticated and let you through, as opposed to, like, blocking you or stopping things. So rather than the default being the negative bit, which is like, nope, sorry, access denied,
Speaker 1:it's actually a positive one instead if for some reason it fails. And I think they're moving to, like, regional quorums, not necessarily these globally distributed things with Spanner. So I think they're trying to work out a better approach so that you don't have such a catastrophic outage, and marginalizing it. So, you know,
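And as a companion to the sketch above, here is what the fail-open ("fail positive") idea looks like in the same toy style. Again, this is purely illustrative and assumes made-up policy names, not how any particular cloud provider implements it; the point is that a missing or obviously bogus limit lets traffic through and raises an alert, rather than locking everyone out.

```python
# Sketch of fail-open ("fail positive") behaviour: if the quota policy can't be
# read or looks invalid, allow the request rather than deny everyone.
# Purely illustrative; policy names and values here are made up.

from typing import Optional

def load_quota_policy(service: str) -> Optional[int]:
    # Stand-in for a lookup against a replicated policy store; returns None
    # (or a bad value) when the store is misconfigured or unreachable.
    broken_policies = {"auth": None, "service-discovery": 0}
    return broken_policies.get(service, 1000)

def allow_request(service: str, current_count: int) -> bool:
    limit = load_quota_policy(service)
    if limit is None or limit <= 0:
        # Fail open: a missing or clearly bogus limit should not take the
        # whole service down, so let the request through and raise an alert.
        print(f"[alert] invalid quota for {service!r}; failing open")
        return True
    return current_count < limit

print(allow_request("auth", current_count=5))               # True via fail-open
print(allow_request("service-discovery", current_count=5))  # True via fail-open
print(allow_request("storage", current_count=5))            # True, under the real limit
```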
Speaker 2:And Google are, you know, probably one of the best
Speaker 1:I'd never ever implemented that store as I was like, I was over the other day. Now I'm quite good
Speaker 2:at software engineering. So, you know, there's a chance that they can solve all that, for sure.
Speaker 1:Well, staying on the old Googs, they did an acquisition of Wiz. Now, Wiz is a cloud security company that's basically got a SaaS product that can scan multi-cloud. So I think it's Amazon, Azure, Google, and they might do on-prem as well; they probably do, I'm sure they've expanded into those areas. And it's, like, an agentless-style approach.
Speaker 2:They do. They're multi cloud. This is part of the, the
Speaker 1:They are multi cloud.
Speaker 2:Because they cover all the clouds and are API-led, they have tight integrations with all the services across all the clouds. So Google acquiring them means, oh, are they gonna carry on doing that? Exactly. And if not, no market leader... Yeah.
Speaker 1:Exactly right. Yeah. So the DOJ is scrutinizing whether this is basically antitrust, as in, like, you know, can we trust that they're gonna do the right thing? They bought a company called Mandiant in 2022, which is basically a cyber threat intelligence company. So they already did one acquisition, and then they're buying Wiz.
Speaker 1:Now the issue with Wiz is it's one of the most successful and most used cloud security products. You know, their market share was insane. Their exponential growth, I think, was the fastest growth of any startup, bar, now obviously, OpenAI and other things that have come out since. But their growth rate was insane.
Speaker 2:That market specifically.
Speaker 1:In that market specifically. Yeah. Yeah. But I think overall, for a startup back then, it was radical. This is pre the AI movement, almost.
Speaker 1:Mhmm. So, yeah, they did a phenomenal job as a company. But like you're saying, they obviously are multi-cloud; people are using them for many things, not just for Google. And I think there's a risk that they will bundle things up, maybe, you know, to kind of overly incentivize using Google. They might start integrating wizzy bits. That's obviously a term.
Speaker 1:That's a term I've come up with. It's not, you know, that's not a term Wiz uses. But the wizzy bits might start to be integrated into Google. You know, they might start integrating Wiz into their own ecosystem under the hood.
Speaker 2:It does give a good nod to the technology that Wiz have. Apparently they have graph-based storage to model the entities in the cloud and work out the relationships. I don't know how much AI there is; I haven't really looked at the product in depth. But having the nod from Google certainly shows that the tech's impressive.
Speaker 1:Oh, yeah. I mean, some of our staff went to Wiz in the early days. Yeah. So, but, yeah, they did, to be fair, do great.
Speaker 2:Yeah. So, because they are a leader and they have good tech, do you think that there's a lot in this DOJ case? And do you think there's a risk to the market and to trust, or do you think other issues are more important, like actually just clouds getting too big and owning too many services?
Speaker 1:I mean, it seems, to me, I think it is correct to scrutinize. Like, every company that becomes successful, you can't just buy until there's nothing left. Do you know what I mean? There has to be a point of, like, protection of competition. You need competition. Competition drives the right innovation.
Speaker 1:We can't eradicate, like, any competition, because otherwise you're just like, what's the point? I can't do anything. Anything I go to try and do as a company, they've done, or they're going to buy someone and then integrate it, and then I'm out of business. You know, all these companies then just can't survive, because they've got such insane growth.
Speaker 1:So I do think it is right. But, I mean, they're not on their own. I feel they've been a bit singled out. Obviously, this is compared to, say, Microsoft, that does get away with quite a lot, I think, in terms of their reach. And, you know, Amazon kind of similar-ish. They've got VCs, they're backing other startups.
Speaker 1:It's not just that, right? They're investing.
Speaker 2:Nvidia buying every other company. There's not a lot.
Speaker 1:Nvidia doing that, there's nothing, nothing left.
Speaker 2:It's just
Speaker 1:going to be owned by a lot of consolidation.
Speaker 2:It's interesting. Google are not the market leaders in cloud, poor little Google, but there's so much consolidation. It's like, really, is that an issue?
Speaker 1:But anyway. Yeah. Yeah. So I don't know. I think they were talking about breaking Google up as well. There were conversations about that, you know, especially because I think there was also that debate around Chrome being pulled apart and not actually being part of Google, because obviously other devices use Chrome, and so to make it independent and...
Speaker 2:Do you think these companies are, like, almost too big to fail and too big to break up now? And have they got more influence on authorities around the world than, you know, is reasonable for the regulators to rein in, or...
Speaker 1:Yeah. I think it's the new oil and gas, isn't it, in some ways? It feels a bit like it.
Speaker 2:Yeah. I mean, my wish for, like, a more distributed world with open standards, you know, feels like something we need to get back to. But, you know, you're right. They're the new moguls, the new massive orgs.
Speaker 1:Yeah. And I think, you know, we've had this before with pharmaceutical organizations. It's the same, it's the same in many sectors, right?
Speaker 1:They're not on their own. You've got pharmaceuticals, Pfizer and other large bodies that dominate, the same as you've got in oil and gas, the same as you're gonna have in many segments. But I think because it's kind of cool and very B2C, and they provide lots of cool, innovative features, you know, you've also got, like, the Facebooks of the world buying lots of other companies up and integrating them, you know, the WhatsApp purchase and Instagram. So, yeah, I think people maybe don't mind so much from a consumer perspective, because they're getting quite a lot of value out of them day to day. So it is a funny one.
Speaker 1:It is. But it is obviously not great that you've got such behemoths owning the world so much. But, yeah, anyway, not to talk too much about that: this Andrej Karpathy talk at the Y Combinator AI Startup School, Software 3.0. I don't know a huge amount about this. So what was said?
Speaker 1:What's happening?
Speaker 2:I watched it at the weekend, actually, on Saturday, when it was too hot to do anything. I actually went indoors while it was crazy sunny and watched a YouTube or two. Again, YouTube, it's the same as the Internet. But Andrej Karpathy, he's, you know, a bigwig. He's one of the big sort of luminaries in the AI space.
Speaker 2:He was the director of AI at Tesla for around five years, left at about 2022, moved over to OpenAI, had done lots of work at Stanford in the past, and he's another one that left OpenAI around the big shakeup with Sam Altman, some of the, are they open or closed AI, stuff. Anyway, he did a talk at Y Combinator to the up-and-coming community, who will probably go on to start lots of companies. And he was trying to be inspirational and frame how people, young people coming into the industry today, should think about software. And he did a talk a few years back about how Software 2.0 is different from Software 1.0.
Speaker 2:So Software 1.0 is traditional, hand-written software. Software 2.0 was describing neural nets, and how trained, data-led software models can be trained, with things like reinforcement learning, to do highly specific things. And the whole Internet, as a result, has changed. You know, the algorithms, most famously for your feeds and your social media and your recommendation engines on various platforms, are Software 2.0, but they're highly specific, and highly successful as well.
Speaker 2:You know, they keep you on devices, and you can see people tripping over their phones everywhere these days because that software is so successful. But there's a new paradigm, Software 3.0, where you have foundational models, LLMs, and attention transformers that really scale up and are generalized. They're not specific anymore. They can do almost anything.
Speaker 2:So he's trying to frame the fact that we used to have just Software 1.0, but the world has changed and there's a lot of Software 2.0 out there, and now Software 3.0 is growing rapidly. And, as we spoke about earlier, local devices can run models, and actually the operating system and the thing you interact with is a mixture of these different software stacks. It's a fascinating talk, you know, framing...
Speaker 1:Not to be, like, too controversial, but, like, you know, 1.0, 2.0, 3.0, 4.0... Yeah. No. I think...
Speaker 1:Yeah, I think it's slightly meaningless, but, like, what does he actually mean? What is it he's saying? Is he saying, if you're new... You know, he's gone to this, like, AI startup school, right, or something. I think that's what it was.
Speaker 1:So you've got a bunch of people that are obviously motivated to create startups. Mhmm. I guess, what was he really getting at? Is he saying, if you're going to innovate in this space, you should be innovating leveraging the more generalist models, and thinking about how these can support your business and drive innovation?
Speaker 1:Or
Speaker 2:It was that. But it was also how to think about them and what they're good at. So, understanding that Software 1.0 still has value, but it's more at the edges, for very specific things where you actually need to store data. It's good to have Software 1.0 actually saving to disk, because a model just sort of spaffing out data probably isn't very good at that. But at the same time, people should understand what generalized models are capable of, what they're not, and how we should think of them.
Speaker 2:And he was relating the psychology of a model to kind of human spirits, people spirits, in that they're stochastic, and they can be absolute geniuses in some domains yet not know how to add, and have massive holes in their capabilities which are not obvious. And so you've got to keep them on a rein, and be able to move a slider between autonomy and being interactive with a human in the loop. If you have software, it needs to be rewritten so that AIs can use it, whether local or not, wherever it is. The AIs need to be able to interact with the user interface of software and be a collaborator, and you need to be able to move that slider to say, AI, you can do this bit. And for certain domains and certain areas, like recommendation engines or labeling images or whatever it is, they can completely do it.
Speaker 2:But for other bits, you really need to be in that loop of verifying, with people in the loop. Really successful software, and I'm thinking about Cursor, but, you know, Clippy or whatever, needs to really play to the advantage of what an LLM type of operating system for software looks like. It was fascinating, and it very much chimed with my experience using software like Cursor, and how far you move that slider, being able to move that slider interactively and understand and verify very quickly. He related it to the Iron Man suit: the Iron Man suit can fly on its own and do certain things, but at the same time, he's in the suit driving it. So it's got capabilities, but you need to know where and when to use it.
Speaker 2:But just to
Speaker 1:be in the suit, I mean, or is that just his ego?
Speaker 2:Yeah. You're probably right. Iron Man. Is it
Speaker 1:the question? Actually, would it be more effective if he wasn't in the suit? That is the real leading question.
Speaker 2:Yeah. I found it a very good talk.
Speaker 1:It's cool.
Speaker 2:It's well worth watching.
Speaker 1:Can you just listen to it, or do you need to watch it?
Speaker 2:Yeah. Yeah. It's a podcast, but it's probably best to watch just because, I mean, you've got the slide deck. Okay. Well, I've linked you to the slide deck.
Speaker 2:You can, have a look at the slides yourself and listen to
Speaker 1:it and move through the slides. Do you need to see the slides to understand what he's talking about?
Speaker 2:That's a good illustration. I'm more of a, a visual person, so it helps me.
Speaker 1:I'll try and have a listen first, see if I can make sense of it. If not, I can always have a little whisk through the slides. Yeah. But, yeah, it does sound very interesting. Gotta be...
Speaker 2:One other interesting takeaway was where he tried to frame where it is now and where it's going. So he worked at Tesla, where they had full self-driving, mostly C++ with some Software 2.0, or neural nets doing image classification and things. And then slowly, over time, that bit of the software, the self-driving bit, has completely changed to pure foundational models and lots of different smaller models orchestrating between themselves to do the job. So there's no Software 1.0 in that stack. There is for the entertainment system and the user interface, for the user to see what's going on.
Speaker 2:And he sees that trend happening at large, but he sees the parallel for where we are now as very much like 1960s mainframe machines, you know, where you're at a terminal to a supercomputer, and no one had local computers. Whereas today, we don't really have local models, and they're not fully capable of all the bits. And most software is Software 1.0 at the moment, with bits of Software 2.0, bits of neural nets, out there. But we're seeing a dramatic shift right now, with loads of companies we've already talked about today delving into local models, user interfaces with users in the loop, and that shift in software as LLMs and foundation models take over lots of existing software interactions. So, fascinating times.
Speaker 2:It's good to have it done and framed well.
Speaker 1:Yeah. Who knows? Who knows where it's gonna evolve to? Very, very interesting. But, I'll I'll give it a listen.
Speaker 1:Cool. Well, I shall share those slides. I'll make sure I put them in the brief, actually, when we do the episode. Yeah. So people can find them more easily, rather than having to dig around. But, yeah, that's the end of the episode.
Speaker 1:Hopefully, it was informative, and we should be back next week now I've returned. Although I think I might also be going away again, but I think it's almost Friday. So the week after that, I might be away; I've got my sister's thirtieth. But there we go. Cool. Speak to everyone soon.
Speaker 1:Bye.