Nov 4, 2023

26 – Debate on the Role Govt. Should Play in AI Regulation | Venture Markets | Pharmacists Walk-out

Featuring: Vic Gatto & Marcus Whitney

Episode Notes

In this episode, Marcus and Vic engage in a healthy debate surrounding the crucial question of the government’s role in regulating AI as they dive into the complexities of balancing innovation and ethical considerations, exploring the potential risks and benefits associated with various regulatory approaches. They also discuss the latest developments in federal policies impacting the tech industry, analyze the dynamic landscape of venture markets, and examine the unprecedented pharmacist walk-out, shedding light on the underlying factors driving this bold act of protest. Tune in for a captivating exploration of these topics as we navigate the ever-evolving intersection of technology, governance, and societal impact.

Stay Connected

KEEP UP WITH THE LATEST HEALTH:FURTHER EPISODES, NEWS, AND EVENTS!

Watch this Episode on YouTube

Watch, Listen, and Subscribe!

Episode Transcript

Vic: [00:00:00] How are you, Vic? I am good. For the first time, we’re talking about exits in the portfolio today, which is kind of fun. Finally, in 2023. It’s been a long, long dry spell. It ain’t closed till it’s closed, but just someone wanting to, you know, buy one of our assets is kind of

Marcus: fun. Man, in this economy, uh, that’s, that’s good news.

Yeah. That’s good news for sure. For sure. Uh, well, you know, best of luck on, on getting that across the line. I know you’ve been working on that today, having calls. That’s the fun part. It’s more fun work. It is the fun part. It is the fun part. That’s awesome. Uh, I mean, on my side, you know, things are, things are kind of stabilizing out after a pretty crazy month.

Um, You know, just kind of getting back to normal work. Um, yeah. So that’s, that’s pretty much all I got. We can jump right into the show.

Vic: Yeah. You’re right. Yeah. I got a lot of feedback from last week. So, uh, people did listen out there. Yes. It was good. Good. That’s a good thing. All right. Uh, we’ll jump right [00:01:00]

Marcus: in. Quick, quick note about the Fed. Um, they are pausing rate hikes. Uh, so kind of no news is no news. Uh, but, you know, I think that Wall Street immediately starts running and betting that that’s the end. I don’t... I mean, I do understand, because they’ve been betting again that it’s the end for the last year and a half.

And so, um, I, I understand it. It has not been a good bet, uh, at all. And I saw a headline where Jamie Dimon said that he thought the Fed was probably not done and that there certainly could be more hikes in the future. I think you and I, based on what we saw with the GDP, with the last, uh, with the last print, um... it’s the job’s not done.

You know, the job is not done. The consumer is not weak, inflation is not tamped out, and until those things really show some sense of capitulation... you know, 4.9 percent GDP growth. That’s not [00:02:00] the objective of rate hikes, right?

Vic: Right. Yeah, I mean I think I think Wall Street was happy that the Fed kind of took a pause and they’re going to wait and see how the effects impact the overall economy, which was more positive than them hiking right now, right?

But I listened to the press conference last night. Chairman Powell spent a lot of time talking about, you know, there’s risks on both sides. We might have to raise more. We may be finished. He didn’t at all seem like he was completely done. He did sound like he was watching all the different data feeds the Fed has access to. You know, on net, on balance, it’s probably good.

Marcus: Yeah, for sure. Yeah. I mean, look, it certainly feels like we’re near done. I mean, you know, I don’t know how much more you can layer on top of what’s already been done over the last year plus. So, uh, it doesn’t feel like we’ve got another two, you know, full percentage points in us. But, you know, do we have an incremental 25 basis

point hike at [00:03:00] some point in the next, you know, three to six months? I, I feel like, yeah, maybe likely, I think it’s likely. Yeah. Um, but anyway, I don’t think there’s much more to talk about. No, I

Vic: just wanted, it’s good to mention it. Cause it, cause you know, the fed only comes out once in a while.

Marcus: Exactly.

We’re constantly monitoring that. All right. Um, so, more... just a quick note on the labor side of things, a little bit nontraditional because this is not a proper union, but, uh, pharmacy staff across multiple, uh, companies, CVS and Walgreens, uh, started a three-day walkout. So, you know, measured, insofar as, uh, it’s not an organized true union strike with, you know, clear demands that could be negotiated in an organized way.

Yeah. There’s no one really

Vic: to negotiate

Marcus: with. There’s no one to negotiate with. Correct. But there is a, there’s a clear show of dissatisfaction and resistance, and it continues this narrative of, you know, labor versus employer.

Vic: Yeah. And I think the pharmacy, the business of being a pharmacy, is not profitable today. The [00:04:00] PBMs and other aspects have sort of hollowed out the dispensing fees and what you do to run a pharmacy. And I think the real big complaint is not their own pay.

The pharmacists are not looking for their own pay to increase, they’re looking for more support. Their assistants and their staff that they used to have, have been hollowed out. They don’t have support anymore, so they just can’t get the job done that they feel like they need to do, which seems like a fair...

A fair request or a fair demand.

Marcus: I mean, anecdotally, when I have gone to my pharmacy, which is, it happens to be a Walgreens pharmacy, um, over the last year or so, I can’t say the staff look particularly happy at any point when I, you know, was in line and, and, uh, picking up my, my medications. And so, you know, I can see where, and there was, there was always a line, they were always doing work.

You know, there wasn’t like, right now. Right. No. So, [00:05:00] um, you know, it’s, it seems like a very busy, labor-intensive job. And I think having adequate support is probably reasonable. But as we’ve talked about, um, you know, the labor complaints or the labor demands or the labor issues... um, in order for there to be real, um, real resolution that is beneficial for the labor force,

there has to be a there there in the business model. There has to be a there there in the P&L. And I think, as you correctly noted, the, the pharmacy P&L has been attacked and has been hollowed out by, by other players. And so it’s gonna be interesting to see. Um, you know, look, pharmacies are a key reason why people go to these stores, and I think that’s what the pharmacists understand, is that, you know, you can’t totally, um, isolate us in a pharmacy P&L, because we drive traffic, right, you know, to the store. People walk in, and then they buy whatever else. So, you know, they’re probably saying, listen, you [00:06:00] might have to rob the retail P&L, uh, to help us more, but we are certainly the golden goose in terms of foot traffic, and I think that that’s, you know, that’s fair, fair play.

Vic: Yeah. I mean, I think that the, this is a continuation of the theme that the way that we staff and deliver and reimburse care is not aligned with where all the value is and where the most people are needed. And pharmacy is not a very profitable sector. And yet it’s a critical thing that we have to have.

And so that needs to be sort of recalculated defragmented, brought together, and then, you know, somehow incorporated into a new business model that can support it.

Marcus: Yeah, I mean, it feels to me like this is just going to be the first of many painful chapters in a story around how pharmacy is going to be transformed, right?

We’ve already been talking about Rite Aid’s flat-out going out of business. And so, [00:07:00] you know, you can’t even really talk about the pharmacists in those stores anymore, right? So, so, so that, that doesn’t... to me, that just does not bode that well in terms of negotiating leverage that, that the, the, the pharmacy workforce is going to have here.

Unfortunately, I mean, you know, I empathize greatly with, with the need for more support and, and, uh, tough working environment.

Vic: Yeah. It’s just, it’s just important to note, like, for non-union workers to stage a three-day walkout across multiple brands, all kinds of states... there’s a lot of suffering there.

It’s not like a, just a one-off thing. There’s a, there’s a systematic problem. I don’t know how we solve it, but it’s a, it’s a big issue.

Marcus: Right. So, so more and more pressure, uh, on, on the pharmacy business. Uh, shifting to a story really around what’s been happening in the private markets. Um, this is... this Olive AI story is...

We have not talked about it a whole lot [00:08:00] because we started this show in 2023, uh, 26 episodes, and we do it weekly. Um, and Olive AI was really a story of 2021. Yeah, it was

Vic: already, it was already on the way to the grave when we started. That’s

Marcus: right. That’s right. And, and in 2021, while it was rising, you know, faster than any other digital health business,

you and I would regularly sort of talk about it and, you know, see stats where they said they had 10 percent of all the health systems as customers. And, you know, we, we would ask all our friends who work in health systems here in Nashville, hey, are you guys working with Olive AI? Some of them would say, never heard of it.

Some of them would say no. Um, and so there were always things that felt not quite right about this business, but probably the most important one was: over the course of, I think, four different raises, but it could be three... three or four different raises in one year, they raised nearly a billion dollars.

And the question is, what the hell were you going to do with a billion dollars raised in a year other than spend it? Like, how can you possibly turn that [00:09:00] into, you know, two or three billion of revenue by raising that in a year? I mean, what in healthcare works that fast as far as innovation?

Vic: I mean, nothing. It was never it.

The team, starting with the CEO, was really good at raising money and telling a story, but it never really translated to delivering value with health systems as partners. It was just a story that didn’t materialize.

Marcus: And I think it’s also important to remember, you know, because all these things start to blend together.

But Olive AI came out before the upgrade in AI, which was generative AI and GPTs and those transformers. So really what they were selling was RPA. They were, they were selling robots clicking and pointing and shifting data around on screens. I mean,

Vic: And, and, uh, a cute story. I mean, what’s interesting to, [00:10:00] to imagine, though, is that they came out with this big story, raised a ton of money, you know, all kinds of, uh... naming their people Olivians and making up all kinds of new names.

And they were about two years early. So, like, it might have caught the AI wave and really made a difference, but they just were early. The timing is everything.

Marcus: Well, I asked you, when we were looking at the General Catalyst announcement about the merger of Commure and Athelas two weeks ago, two or three weeks ago, I can’t remember exactly when it was, but I asked you, dude, you know, where’s Olive? Because literally this Athelas company, that’s remote patient monitoring, also in their portfolio.

Yeah. Not remote patient monitoring, um, uh, revenue cycle, uh, that, that’s RCM. Like, that sounds a lot like Olive, right? And

Vic: they had already... they weren’t worried about Olive. They’d moved that out, or moved on.

Marcus: And General Catalyst was, was, you know, a really, [00:11:00] really big... there were others. Um, uh, Ascension was in this deal.

Oak was in this deal. But General Catalyst was a very, very big investor in, in Olive.

Vic: They’re really trying to push this, um, upgrade the back office of healthcare, make a difference. And I’m in favor of that, but they have been on the bleeding edge, maybe in front of what’s possible. And Olive certainly was that.

Marcus: yeah. I mean, just the amount raised. I mean, 800-plus million dollars.

And two years later, it’s out of business, selling for parts. I, I don’t want to do what we did in the, in the, in the Pear Therapeutics, you know, episode, where we, you know, fully diagnosed that, that meltdown, but this is clearly a very similar meltdown. Um, you know, these kinds of things are not good for the entire venture space.

When they happen, uh, they just put a bad taste in everybody’s mouth. You know, all the articles start getting sent around. We get our LPs sending us links to this stuff, you know, Hey, are you, are you invested in one of these, you know, kind of ridiculous companies? Um, you [00:12:00] know, the good news for our LPs is we don’t get invited to, we don’t get invited to these crazy billion dollar rounds.

Um, but yeah, I mean, it’s just kinda, it’s just insane. Just insane. There’s

Vic: this constant, um, push, really, from Silicon Valley. This... they happened to locate it in Ohio cause they got tax benefits and things, but it was a coastal-backed company, really. And there’s a, there’s a Silicon Valley thing of trying to, like, just replace healthcare and knock it over and redo the whole thing. Every few years

someone tries to do that. And at least so far it has not been successful.

Marcus: All right. So coming off of the, um, the Olive AI story, I mean, I think there is a larger story that we can talk through about just what’s going on in the private markets. So PitchBook has, uh, really done a great job for years now of putting out a quarterly document called the Venture Monitor.

And it’s really sort of what we in the private markets [00:13:00] use, both looking at VC and PE, to kind of see how is everything going? How, how are... you know, how, how’s funding going? How are exits going? What are valuations? How many deals are being done? But now that Carta has been in the market, you know, for three, three, four years now, and really gotten a lion’s share, they

Vic: run the cap tables for a decent percentage of the market.

Marcus: Exactly. And, and when, when these transactions are done, you know, they’re all sort of logged on Carta, kind of as a ledger. Um, it really is making them, you know, I think a little bit more of a, uh, like, like the proper record, quite frankly. Um, you know, with PitchBook, you’re, you’re kind of translating what’s happened into PitchBook, and they’re hounding you to update your records with everything, but Carta’s got the data. Like, they literally got the, the cap tables, as you said, and, and the securities are all registered, um, on their platform.

It’s also a pretty big walled garden. Like

Vic: I, uh, had a call from a founder today. She, she needed a bunch of data on a potential investor. PitchBook’s [00:14:00] pretty expensive. So, like, you don’t have... if you’re a founder or even a small VC, it’s expensive to get. Like, we were able to afford it, but, but Carta can give you now pretty competitive data, and they, they open-source it.

They, they, as we’re showing here, they publish it.

Marcus: That was a really good point, Vic. Thank you for making that point. I’m just assuming, like, everyone has access to PitchBook, but no, it’s super expensive. Yeah, it’s hard. So, so anyway, they came up with this report called the State of Private Markets: Q3 2023.

I’m actually not sure how, how long they’ve been doing this, but this is my first time actually looking at it. And I think it’s a great report. Um, you know, they show some really great stats. They’re they’re Q3 highlights.

Vic: Great is incomplete.

Marcus: Correct. Yeah, exactly. So I do find that, that they have sort of an interesting way of framing things, right?

So their Q3 highlights are: early-stage valuations are on the rise, time between rounds is lengthening, and 409A valuations are staying put. So that’s kind of the highlights. What I thought was interesting was [00:15:00] they didn’t really juxtapose that against how many companies have died, and also the complete,

like, cratering of deal volume that, that is happening. And there’s one specific chart here. This, this, uh... the link is going to be in the show notes. Now, there’s a bunch of charts. There’s a bunch of really, really great charts. But, um, the chart that I really liked is the one that, that talks about valuations, but also covers, um, deal count.

So it’s under the heading of fundraising and valuations. And, uh, they’re, they’re making the point that valuations in the seed and Series A space have not decreased. They’ve actually gone up from 2021. Right. So, um, that’s, that’s good news. In the seed stage, there’s been a 26 percent increase, a 0.5 percent decrease in deal count done, between 2021 and 2023.

And in the Series A space, and this is, this is right after Charles Hudson’s article [00:16:00] last week, where he said he thought, he felt that, like, it was a 75 percent decrease: a 50.3 percent decrease in Series A deals from the second quarter of 2021 to the second quarter of 2023. 50 percent. The number of deals getting done has been cut in half.

Yeah, I don’t care if the valuations are up 7.1 percent in the Series A space when half the deals are getting done.

Vic: Well, obviously. I mean, it’s, it’s the definition of survivorship bias, right? Like, the valuations are holding constant on the half of the market that is getting funded. Right. Spoils to the winners.

Yes, but all of the ones that are less than the top half are not getting funded. And so their, their valuation is zero. And so, you know, they didn’t average that in. On, on balance, it’s a survivorship bias thing.
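Vic’s survivorship-bias point can be sketched with a toy simulation. All numbers here are made up for illustration (the uniform valuation range and the top-half funding cutoff are assumptions, not Carta data); the point is only the direction of the effect.

```python
import random

random.seed(7)

def median(xs):
    """Middle value of a list (average of the two middle values if even)."""
    s = sorted(xs)
    n = len(s)
    return (s[n // 2 - 1] + s[n // 2]) / 2 if n % 2 == 0 else s[n // 2]

# 1,000 hypothetical companies seeking a round, with made-up valuations ($M).
seekers = [random.uniform(10, 60) for _ in range(1000)]

# Boom year: essentially every company that wants a round gets one.
funded_boom = seekers

# Downturn: only the top half (by valuation) gets funded; the rest raise $0.
cutoff = median(seekers)
funded_bust = [v for v in seekers if v >= cutoff]

print(f"boom: {len(funded_boom)} deals, median ${median(funded_boom):.1f}M")
print(f"bust: {len(funded_bust)} deals, median ${median(funded_bust):.1f}M")
# Deal count is cut in half, yet the reported median valuation of *funded*
# deals goes up, because the unfunded half drops out of the statistics.
```

The direction doesn’t depend on the particular distribution: conditioning the statistics on being funded always pushes the reported median toward the surviving half of the market.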

Marcus: I mean, we, we are a full year-plus from when you and I started [00:17:00] banging the drums about, you know, how bad this was all going to be. Founders now get it.

I mean, we’re, we’re seeing it in, in Jumpstart Foundry applications, right? We’re seeing companies that in 2021 would never have gone for a pre-seed round, and now, yes, they’re absolutely going for pre-seed rounds, right? So we

Vic: actually called together the portfolio last November. Yeah. Yeah. Yeah. And did, like... it was just with our portfolio companies, not public like this, trying to warn them, and they were still, uh, struggling to understand it.

But I think now they understand,

Marcus: yeah, everyone, everyone understands at this point. But I mean, having this data, I think, I think it’s super helpful, because it’s from a third party. It’s not us as VCs, you know, being positioned as, oh, we’re trying to negotiate better positions and lower valuations and all this other kind of stuff.

No, you literally have half the chance you had two years ago of getting a Series A round, right? It’s twice as hard. Yeah. Like, you know how hard twice as hard [00:18:00] is? In anything? I mean, it

Vic: was hard to raise money two years ago. That’s right.

Marcus: And now it’s twice as hard. Like, take whatever is, like, your max on the bench press, and then double the weight.

That’s how much harder it is. Uh, you know, this is, this is intense. And, and the thing is, rates don’t have to go up another 5 percent for this to drop another 50 percent. They just need to stay where they are longer.

Vic: And I mean, I think... I mean, it sounded like Chairman Powell last night was going to hold them this high, maybe a little higher.

Maybe he’ll cut by a quarter point, but they’re not coming down to zero. No, no, no. So the other thing that I, I think this shows is that, I mean, seed and A deals, you can’t go too low, right? The, the valuations, they kind of are what they are. You need to allow the founders to have some ownership. The pre-seed and seed investors, they can come down a little bit, [00:19:00] maybe, uh, but they’re basically flat.

But as you go to B, C, D, the later stages, they have had the same loss of number of deals, so the same survivorship bias, but the valuation has come down dramatically.

Marcus: Yeah. And so it’s just,

Vic: it’s just a pure. It’s just a difference between the later stage and the early stage.

Marcus: Yeah. So just to, to, to put a finer point on what Vic just said: in the Series B, um, deal count is down 47 percent,

so pretty much as bad as Series A, but the valuations have come down 21 percent. In Series C, the deal volume has gone down 53.8 percent, so worse than Series A, and the, the valuations are down 37 and a half percent. And in Series D, the deal count has gone down 63.9 percent, and the valuations have gone down 49 percent. So the valuations are literally cut in half.

The deal volume is... I mean, I don’t think people understand how much [00:20:00] things have changed. This, this is a dramatic difference in our markets. Yeah, like dramatic
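As a rough back-of-envelope on those Series D figures (illustrative arithmetic only, treating the quoted percentages as exact): when deal count falls 63.9% and median valuation falls 49%, the combined count-times-valuation activity lands well under a fifth of its prior level.

```python
# Series D declines as quoted from the Carta report.
deal_count_factor = 1 - 0.639   # 36.1% of the prior deal count remains
valuation_factor = 1 - 0.49     # 51% of the prior median valuation remains

# Notional "count x valuation" activity relative to the prior baseline.
combined = deal_count_factor * valuation_factor

print(f"Series D activity at ~{combined:.1%} of prior levels "
      f"(roughly a {1 - combined:.0%} decline)")
```

So the two declines compound: roughly an 82% drop in notional Series D activity, which is the "dramatic difference" in the market being described here.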

Vic: and this is what we were talking about last week. The job of a seed founder or seed VC is to get to cash-flow positive and healthy growth. Hopefully the healthy growth is still very fast, but it’s not growth based on new equity coming in. You have to be, like, living on your own strength, bringing in revenue, bringing in gross margin earnings, and then investing that into growing. You can’t rely on C, D rounds. They’re just not there.

No, no. It’s probably on net healthy, but it’s going to be a really hard transition. There’s a lot of business models, a lot of teams, that are not designed for that. And there’s... it’s hard to change your business model after two years of doing

it,

Marcus: that’s right. And quite frankly, when you look at this, it makes things like, you [00:21:00] know, um, the, the Main Street Health round even that much more impressive.

Like, for that kind of round to get done in this environment is unreal.

Vic: It’s unreal. That was a strategic crowd. I think strategic rounds, where it is the strategic value... those payers were investing in Main Street because they want to have access. They want it to exist. They want to learn from that. Those deals will still get done.

The financial rounds are gone.

Marcus: And so, okay. So that’s, that’s actually a really good point, cause I, I agree, and it’s something I’m constantly telling founders. What does that say for the value of a strategic fund versus a sort of pure financial arbitrage fund? Because I think we always described those as two different animals, but generally I think they had the same business models.

And I think even, you know, the financial one was seen more favorably, because it was playing power law games and, you know, was playing... you could play

Vic: momentum games, without the drag of having [00:22:00] to deal with these, all the strategics, right? You know,

Marcus: but, but, but in this new world, I mean, it seems to me that the, the roles have reversed here.

And actually, the more strategic, the more aligned, the more you really understand the market, the more likely, quite frankly, you’re going to get a hit. You’re going to... you know, it’s, you know, your slugging percentage and your batting average are more likely to be in a place to generate returns for your LPs.

Whereas if you were a financial, a financial fund, you know, when you’ve got capitulations like we’re seeing in Series B through D, both in deal count and valuation... I mean, what’s the model? What is the model for a pure financial VC fund with these results?

Vic: There is no viable model, if the public markets don’t change.

So for a Series D fund that is following around other funds, and purely financial, and hoping to then kind of be in the private company for a couple of years and then go [00:23:00] public, that’s not functional with the public markets where they are today. It just doesn’t work. So you either have to be early, where we are, or you have to be strategically valuable and add, add several benefits to your LPs.

There’s a financial gain, certainly, but there also is market intelligence. It’s almost like an extension of the R&D efforts. And I don’t think, I don’t think the public market multiples are coming back anytime soon. They’re totally levered... I think they’re levered to the Fed funds rate. And so again, unless we have some reason to see rates cut to close to zero, your IPO markets aren’t coming back.

And then there’s not, there’s just not a financial arbitrage like that.

Marcus: I think, also referencing the public markets, um, you know, Meta just had an outstanding quarter. Just unbelievable turnaround from where they were last year, when they were plunging a bunch of [00:24:00] money into the metaverse. Right. Um, a lot of it really because they’ve, they’ve focused on AI.

They fixed their ads. Yeah. Uh, you know, leveraging AI, they fixed their ads, you know, so they beat the Apple privacy thing. Yeah. Which is, like, unreal. Okay.

Vic: Um, and Apple’s gonna come out tonight... so we haven’t got... we’re recording this Thursday afternoon. They might be coming out right now with their earnings.

I think Apple’s struggling. They, they are. They have China problems. They’ve got all kinds of problems.

Marcus: They are, they are. Um, but I, I think the point I was just gonna make is that even though we’ve got all these speculative growth-oriented companies that are just getting destroyed, the companies that rode that wave to the top of the stock market, to the top 10 in both the NASDAQ and the S&P 500, they’re not going anywhere.

I mean, they’re doubling down. They’re in stronger positions, you know, your Alphabets and your Metas, even your Apples. I mean, when we say they’re [00:25:00] struggling, I mean, I mean, they’re

Vic: struggling relative to where they have been in the past. They’ll be fine.

Marcus: Your Amazons. And so even more power accrues to the incumbent technology companies.

Yes. And that is, that’s actually worse for private market innovation. That’s worse for the venture industry. Um, these companies are more and more dominant. Uh, I mean, when I, when I think about Alphabet, and I think about Amazon, and I think about Meta, I do not think about friendly companies to venture firms.

No, they, they, they hate us. Yeah. I

Vic: mean, they are trying to protect their own monopoly, their own business position, and they don’t want upstart companies taking away parts of their market share. That’s right. So they’re actively working against that. I think in healthcare, it’s much more complicated.

It’s hard to have a... I mean, Epic maybe has a pretty strong position, but there aren’t that many platforms like that. Look,

Marcus: thank God in healthcare, [00:26:00] those three companies are not strong players. Yeah.

Vic: Yeah. Well, they have tried. I think all of them have tried. I don’t know if Facebook has tried; they’ve all tried various things. They have, and it’s hard. It’s complicated.

You have to actually, you know, eventually you have to interface with a person and make them better. And that’s hard to do in the metaverse.

Marcus: And it’s got all sorts of liabilities and they don’t like liabilities. Yeah. Yeah. They don’t want to be responsible for what they do at all, ever. Yeah. Um, all right.

So I think we’re going to stop there. Let Doug talk about Jumpstart Foundry. When we come back, we’re going to dig into, uh, AI, uh, because it’s been a big, big week, uh, on the AI front for one reason in particular. And Vic and I are going to dig in on that topic.

Doug Edwards: Thanks guys for the opportunity to talk about our pre seed fund, Jumpstart Foundry.

My name is Doug Edwards, CEO of Jumpstart Health Investors, the parent company of Jumpstart Foundry. We’re so excited to be able to talk about, uh, early stage venture investing, certainly the need for us to [00:27:00] change the crazy world of healthcare in the United States. We are spending 20 percent of our GDP north of 4 trillion a year on healthcare with suboptimal outcomes.

Jumpstart Foundry exists to help us find and identify and invest in innovative companies that are going to make a difference in healthcare in our country. Every year, Jumpstart Foundry invests a fund, raises a fund, and deploys that across 30, 40, 50 assets every year, allowing ease of access for our limited partners to invest, to help us make something better in health care.

Some of the benefits of Jumpstart Foundry is there’s no management fees. We deploy all the capital that’s raised every year in the fund. We find the best and brightest typically around single digit percentage of companies that apply for funding from Jumpstart and we invest in the most incredible, robust, innovative solutions and founders in the United States.

Over the last nine years, Jumpstart Foundry has invested in nearly [00:28:00] 200 early stage, pre seed stage companies in the country. Through those most innovative solutions, the Jumpstart Foundry invests in, we also provide great returns and a great experience for our limited partners. We partner with AngelList to administer the fund, making that ease of access, not only with low minimums, but the ease of investing in venture much better.

We all know that health care is broken. Everyone deserves better. Come alongside us with Jumpstart Foundry. Invest in making the future of health care better and make something better in health care. Thank you guys. Now back to the show. All right. So

Marcus: Vic and I agree on most things, but this is an area that we are not aligned on.

So, don’t want to bury the lede, don’t want to cliffhang too much, but we’re talking about the government’s role in AI. So this week, on October 30th, President Biden [00:29:00] issued an executive order. It’s called the Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence. And when you read it, it’s kind of a...

You can read it in two minutes. It’s not very long. Um, the, the, the fact sheet is not very long. The actual order, I’m sure, is much, much longer, but the fact sheet is not very long. Um, and it’s covering sort of a broad spectrum of areas that AI is going to impact. And it’s giving some level of a directive to the agencies, uh, that are, that are under the authority of this administration on how he wants to direct them to engage with the free market, with other agencies, with the general public, as it pertains to AI.

Yeah. Um, and again, this is, this is to make AI safe, secure and trustworthy. So Vic, you took the time to print and read the fact sheet and you wanted to kind of go through some things. Yeah.

Vic: I want to... I have a couple of things I want to [00:30:00] review, but before we do that... and we will, I’m sure we’ll have a healthy debate, but I want to give you credit.

I mean, you and I have been... I mean, since ChatGPT sort of came out and shocked everyone last Thanksgiving, I mean, about a year ago, you and I, and the rest of the world, have been trying to process this. And you said from the beginning that it was going to be regulated, and I did not think it was going to be. So you were absolutely right on that front. Um, and so I want to give you credit for that.

I didn’t think this was going to happen, and you were absolutely correct about that. Thanks. Now, I, I, I don’t think you’re right about a lot of other stuff, but I’ll give you credit on that.

Marcus: Well, why stop being wrong now? So

Vic: there’s lots of aspects to this, but I picked out four that I think, um, just illustrate my, um, nervousness about it.

So my basic belief is [00:31:00] that AI technology is moving incredibly quickly. And that... well, it might... it certainly is true that the executive branch of our government is going to attempt to regulate it. I don’t think they’ll be successful with that, and I worry about them trying to regulate it and, and having unintended consequences.

So I just picked out four pieces to sort of talk through to illustrate this, and we can bat it back and forth. But the first one is around their require... it’s around safety, safety and security. And so one of the aspects is to require developers of the most powerful AI systems to share their safety test results and other critical information with the U.S. government.

Okay, fine. In accordance with the Defense Production Act, they require companies developing any foundational model that poses a serious risk to national security, national economic security, or national [00:32:00] public health or safety to notify the federal government. And so my, my question... and I have a... that’s fine.

I’m in favor of that overarching theme. Like, sure, if it endangers the public or our security, the government should be aware of that. That’s fairly obvious. The question I have is: how do we know when a model poses serious risk? And how do we measure that? How would a company building that know when to report it and when not to?

I don’t trust that the government has the capability to measure that. And of course, they don’t say anything about that in here. And so that’s where I get nervous about it.

Marcus: Yeah, I, I think that that nervousness is reasonable. Um, And also I think my fundamental position on this is, uh, the government [00:33:00] is in a situation with no good answers, right?

So every once in a while, science puts government in this kind of predicament, which is to say the government has a responsibility, uh, or at least it is perceived to have a responsibility, um, to protect the citizens of its nation state. And sometimes things like nuclear power or bioweapons come along.

Vic: Certainly. Right.

Marcus: Well, let’s not talk about the weapon. Let’s just talk about the raw fundamental thing, right? So, I mean, I just said bioweapons, but really, you know, what I should be talking about is lab-created viruses, right? It’s a weapon when it’s used with malicious intent. On its own... you know, a hammer can build a house, or it can, you know, kill somebody, right?

I mean, so [00:34:00] there are times when science presents a new breakthrough that the government is almost certainly going to be, uh, poorly equipped to address. And yet they can’t do nothing, right? And yet they can’t do nothing. And we spend a ton of time talking about all the things that the government does wrong.

We’ll talk a lot about the dysfunction of the House, and we’ll talk a lot about the dysfunction of the Senate, and more broadly both parties and, you know, everything pushed into the fringes, totally ignoring all of the everyday miracles that happen that certainly the government is a big part of.

And one of the things I brought up to you in the conversation leading up to this was, um, the miracle of air travel and how every day we go to the airport, we hop on a plane, we get in these things and we fly around with a bunch of other [00:35:00] people in the air. And every day, these things, the number

Vic: of planes that are landing at LaGuardia every minute, incredible.

And taking off and they’re weaving in and out together, and that is the government working very well. Yes.

Marcus: And so I just, and look, even healthcare, even with the issues we have in healthcare, we’ll talk about, you know, um, information mistakes or people dying of sepsis or, you know, whatever, um, and how it’s like a, you know, 747 crashing every day.

We’ve heard those kinds of things. And still, it could be so much worse if not for the regulatory apparatus and protections and government oversight and things of that nature. And so artificial intelligence, we know now. We didn’t know before, when we thought artificial intelligence was Siri, and we were like, ah, it’s not that serious. But now that we have generative AI, we know we are on the cusp of the [00:36:00] technological breakthrough of our lifetimes.

Right. I think it’s fairly clear, and it is a tool, and it can be used in all sorts of ways. The government has no choice. They absolutely must get in the arena. They must take a position that they will do what they can to keep the citizens of their nation state safe.

And they have a lot of different vectors from which they have to do that. And I think that this fact sheet, what it really does is it just frames up that they’re aware of it, that these are the areas that they think they have to attack, that these are the agencies that they would appropriately assign to those areas.

And then a general statement of sentiment, and some statement around who needs to report what. They have no standards, they have no capabilities. They have no fundamental, [00:37:00] you know, um, subject matter expertise in the legislative bodies. Uh, you know, they don’t have the structures inside of the agencies, the key agencies.

I would say the only key agency that probably has the appropriate structures is probably the NSA. Okay.

Vic: Yes.

Marcus: You know, and maybe the CIA, but they’re not gonna tell you what they’re not gonna tell you. Right, right, right. But like, you know, the FDA, they don’t have the right structures, you know?

I mean, I was at the Aspen Health event, and the guy who was from the FDA was saying, we’re wrapping AI into our medical device act, like, that’s where we’re sticking AI right now, because we don’t have a structure and a place for it. So anyway, right, I say all that to say they don’t have a choice.

They have to do this because, left unchecked... it’s not hard to see how AI plus nuclear power leads to a potential extinction-level event. So left unchecked, uh, it could [00:38:00] be catastrophic. The government has to start the process. And what I think is, Biden is presenting an executive order because he knows there’s no other place in government that can, you know... we can’t do it.

We can’t do it. We just finally got a guy who’s Speaker of the House who literally, when you ask him what his view of the world is, he says, go read the Bible. What is the Bible going to tell you about artificial intelligence? I’m not trying to knock him. I’m not going to argue that Congress is functional.

I’m simply saying, if someone says, if you want to know my worldview, you know, take a look at this Bible, that does not give you confidence, as a person who’s governing a nation full of people that have all sorts of faiths, that they’re going to understand all the intricate issues that are going to come up as it pertains to artificial intelligence.

So Biden had to do something, had to do something.

Vic: I [00:39:00] guess he had to do something. I mean, my view, I think it’s a Reagan quote, the scariest words that you can hear are "we’re the government and we’re here to help." Like, they’re going to screw it up. Like, I think the government is very good at one thing.

So let me just finish. I think they are really good at saying you cannot land a plane at LaGuardia unless you are a certified pilot following FAA guidelines. You know, you can’t operate on a heart unless you are a certified heart surgeon. They have the ability to prevent people from doing something altogether unless they hit a certain level of competency and follow rules.

I think the government has to play that function. I don’t think they have the ability to actually understand where, uh, a foundational model is risky [00:40:00] and where it’s not risky in its application. I don’t understand that, and I’m much closer to it. You’re, you’re talking about the intrinsic implementation.

Like I, they’re going to implement a bunch of stuff. And I think they might do more harm than good. Okay,

Marcus: so in the arena of the interstate, yeah, interstate travel, cars that are going, you know, 70, 80 miles an hour, there we know what kinds of risks could be presented, right? We know what risks could be presented.

So are you saying that they don’t have any reasonable chance of saving lives in establishing some regulation around how artificial intelligence is deployed, at least how it’s safety tested and the rigor of what it needs to go through? I just, I kind of like it. Let’s just take

Vic: that example.

That’s an easy [00:41:00] one to talk about. Yeah. Um, self-driving cars certainly can get better, but I think there are more deaths due to human error right now than there would be if you had self-driving cars out there. You don’t know that. I do know that.

Marcus: No, no, no, the scales on which self-driving cars have actually been deployed versus human-driven cars are not comparable.

It’s apples to oranges. Like, you don’t know that. You literally can’t know it. You can’t know it because you don’t have a statistically significant sample of self-driving cars. I should have brought

Vic: this thing. I didn’t know we were going to talk about self-driving cars, but four of the top five leading causes of traffic deaths are human error.

Drinking and driving, texting, distracted driving, speeding, and then there’s one that is not human error. And with self-[00:42:00]driving cars, you wouldn’t have any of those.

Marcus: Okay, let me throw something at you. Okay. Cybersecurity. Okay. What’s easier to hack? A human brain, or a network of cars that receive software updates via the internet?

A network of cars is easier to hack. Okay. So how much have we shown our ability to actually protect anything from cyber threats and cyber attacks?

Vic: Well,

Marcus: no, no, no, no, no, no. And so we have no ability. We have had no ability to prevent that.

Vic: Okay.

Marcus: All right. And might a foreign enemy who attacks us via cyber think this is a great way to wreak havoc? Is that possible? That’s possible. I’m not sure it’s the [00:43:00] most effective way, but it’s possible. Well, but it’s possible, right? Sure. Okay. So anyway, I’m just saying there’s so many angles through which this has to be thought through, and we need to be really, really careful about making broad assumptions that the government’s not going to figure it out, because honestly, that’s based on just ridiculous media narratives and just bad sentiment.

And it doesn’t mean that they will get it a hundred percent right. They almost certainly will not. They definitely won’t, because they’re humans, and it’s an organization; they will make mistakes, and those mistakes will almost certainly cause loss of life and cause economic destruction. I think there’s no way that they get it a hundred percent right.

But they can’t not act. That’s the conundrum that we’re in. Like, they can’t not act. [00:44:00] When you really play these things out all the way, because it’s technology, there’s too much risk. There’s just too much risk for them not to come in and say, hey, we need to start really evaluating all of these different scenarios.

And by the way, they know they cannot let America fall behind in the development of artificial intelligence, for all the reasons. AI can be incredibly positive for our society, and we’re going to talk about some of those, right? There’s all these amazing things that AI can do for our society and economically; like, it’s going to be the next frontier.

If we’re going to continue to win the war of innovation and I agree with that

Vic: completely,

Marcus: like we can’t not do it. We can’t not do it. So you’ve got to start to put regulatory frameworks in place. I just think... I guess so. I mean, I,

Vic: I don’t think that I have formed my opinion from media narratives. I think I’ve [00:45:00] formed my opinion from

interacting with government officials, mostly at the state level. I don’t have a lot of... well, that is a different thing. My experience is, with politicians and administrative bureaucracies, the way that they gather information and make decisions is systematically flawed. They gather information from groups of people that are trying to bring an agenda forward.

Maybe they can gather from many different groups, but most people out in the world don’t take the time to go meet with their politician or whoever. So they get a biased, limited view. And then they try to implement policy that is the least bad, but it doesn’t [00:46:00] change. They don’t have the ability to modify it or change it or iterate fast enough.

And so I just don’t think they’re designed for it, and we’ll just jump to the fourth one, because I think it’s probably the best example. They’re talking about, um, both Health and Human Services and education, and they kind of grouped it together. And Health and Human Services, obviously, because of what we do, I was very focused on.

Yep. And it’s just generic, I don’t even know what it means: HHS will establish a safety program to receive reports of, and act to remedy, harms or unsafe healthcare practices involving AI. Like, okay, they should have done that last week. They should do it now. Like, that doesn’t mean anything.

And then they talk about how the Department of Education should support schools and educators deploying [00:47:00] AI educational tools, such as tutoring, um, to help our kids. And we already have that; it’s called Khan Academy. It’s freely available globally. They have an AI tutor. It’s been out for eight months. Why does the government have to get involved in that?

There’s a free service already. That’s much better.

Marcus: Yeah. Well, I mean, the government has distribution capabilities that Khan Academy doesn’t necessarily have. There’s a lot of communities that have no idea what Khan Academy is. I mean, I appreciate that you know what it is, and I know what it is as well, but I actually didn’t know that they had an AI component to it.

I probably would if I had a kid, if you had a kid that needed a tutor, correct? Um, but even still, you know, the government is there, you know, for the least of us, not for you and I, venture capital people, right? Um, I just think they’re going to create

Vic: some competitor [00:48:00] to the existing private option, but it’s a nonprofit and freely available competitor that’s already out there.

Marcus: What is way more likely to happen at scale is they will establish standards for this stuff, and they’ll let the free market compete, and then different states... I mean, you know, the thing about the Department of Education and HHS is that the reality is that stuff is really played out at the state level, not at the federal level, especially HHS.

They do not have, like, broad sweeping capabilities federally to do almost anything. We’ll have

Vic: 50 different experiments, right? Right.

Marcus: So really what we’re talking about in this specific instance is standards, best practices, um, that will get leveraged and probably will tie to, um, federal funding, right?

Probably will tie to federal funding for both the departments of education and also health and human services at the [00:49:00] state level, so they can comply. So if they want to receive funding, they’ll have to adhere to certain standards. So that’s most likely how that will actually play out, if I had to guess. Those standards, it’s not going to be a product, man.

I guess is my point. Like, that’s not actually going to result in a product. It’s going to be a bunch of competing products, and they’re going to funnel that money down. There’ll be some standards. People will RFP. I mean, because back to your point, let’s just keep your point center here, right? Which is,

not a lot of people are actually contributing to how this actually gets worked out. How it gets worked out is lobbyists, which are paid for by, you know, some of the private industries. Yeah, right. Yeah. I mean, who’s spending more money on lobbyists than big tech right now?

Vic: Nobody. And it’s regulatory capture.

We’ll put the Gurley thing on. I mean, it’s back to that; like, they’re going to try to put up regulation to protect their existing monopolies. I agree. I agree. So as an innovator, as a VC, I don’t like that. [00:50:00] I don’t want them to establish standards in healthcare and education, because the standards are going to be set and only change every 18 to 24 months.

Yes. And the tech is changing every day. Yes. But I know they need to do something. I think they do something for political reasons. Because it’s, no, it’s more than that. I

Marcus: mean, yes, political reasons are always part of the calculus of anything that political people are doing.

I mean, so I agree with that, but that’s also very sort of cynical, that there’s no actual, uh, you know, altruistic intent here.

Vic: I agree that they are intending to do good. I just don’t think they will do good. You don’t think they will do any good? You think they will only do bad? I think they’ll cause more harm than good.

Okay, that’s, yeah. That’s why I want them to stay the hell out of it. Okay. But I agree with you that they’re not going to stay out of it, obviously. They’re in it.

Marcus: I [00:51:00] mean, but I think where we don’t agree is, like, do you not think that there is real existential safety risk that they need to lean in on?

And so here’s, you know, I think generally speaking, and I think partially because, uh, Obama used it to get elected, uh, the government was way behind on social media. Yes, way, way behind. And now, you know, we have a surgeon general and now we have attorneys general on it, but it’s 10 years too late. It’s way too late.

It’s way too late. Yes, it’s way too late. And like social media is having devastating impacts on society. Devastating impacts on people’s mental health, devastating impact on our government economically. It’s destroying industries. It’s destroying the media industry. It’s destroying, you know, the advertising industry.

Um, 3 billion people. You can’t do proper customer service on a platform that’s got 3 billion people. I [00:52:00] mean, it’s just an unbelievable, ridiculous monopoly that’s happening in that space. And they got to it way too late. Basically, you could argue they just never got to it at all. And so look, I mean, I’m not for that either.

You know, I don’t think the world is better because we’re all connected via Facebook or, you know, WhatsApp. I would love for someone to break that company up and, you know, let other people actually have a shot at doing social media. At a minimum, force them to create a federated model where people can take their account with them and move to other platforms.

You know, platforms or whatever. So I

Vic: just think that that is a slippery slope, and there’s nowhere to stop it.

Marcus: Maybe, but where we are right now, Meta is a, is a pseudo government. I mean,

Vic: Meta, Google, Apple, they’re all more powerful than any government.

Marcus: Okay. Well, that power’s going to accrue one way or another.

So I don’t have some sort of absolute, [00:53:00] you know, "private market is good at all costs" view. Well, it’s not good at all costs, because then it’s totally focused on the private actors; it’s, you know, in their shareholders’ interests, and not really what happens in the community, you know?

Like, anyone who says that Meta actually cares about what happens in the community, relative to caring about their quarterly earnings reports, it’s just, it’s just, they’re gathering attention

Vic: and monetizing attention. Yeah. However they need to do it, whatever they do; it doesn’t matter if they get attention by stirring up

controversy and helping people to see how different we are and how much we hate each other, or they could get attention through a positive way. And it’s a lot easier with negative. It’s a lot easier with negative, and they’ve done that. And the algorithms, it’s not even in Meta’s control anymore.

Like, the algorithms are just off driving the attention.

Marcus: Well, it’s in their control. They know what the result of the algorithm is. They’ve [00:54:00] set the goals for the algorithms. And on Bankless, they did a show, you know, talking about this, and I love the way that they framed it: you know, dancing for the algos, right?

I mean, the long tail of it is they’ve manipulated our behavior. I mean, one thing I’m pretty proud of with this show is we are not dancing for the algos. I mean, you know, we don’t get a lot of social likes, and we don’t get ginned up on social networks, because we’re going to have this debate or we’re going to talk about this nerdy healthcare shit.

Yeah. I don’t

Vic: know what this is going to do to our audience, but like, I have the opinions I have, you have the opinions you have; we’re just out here, like, figuring it out. We’re

Marcus: doing a podcast that gets pushed out on an RSS feed that, you know, you can get with any podcast player. Maybe you’re listening to it on Apple.

You’re probably listening to it on Apple, um, maybe Spotify, or maybe Downcast, or maybe, you know, whatever podcast player you’re listening to it on. Because this is free content, and I don’t just mean free in terms of you’re not paying for it. It’s free; you know, we’re not being [00:55:00] manipulated by the algorithms. But it’s long form, and the vast majority of people aren’t, you know, first of all, investing what we’ve invested to produce this. You know, they just have a phone.

The phone is the only way that they can actually make media and communicate with the world. They are dancing for the algos; you know, they’re being incentivized for distribution and engagement to say more crazy shit, you know, no matter what side you’re on. Right.

Yes. And tell me that didn’t play a role in what’s happening in our House of Representatives right now.

Vic: It’s our entire

Marcus: culture. Dude, it could be the destruction of our democracy. So I’m sorry. Like, we cannot let the private market just do whatever it wants. You know, I mean, it’s not "private market good,"

"anything that attempts to, you know, regulate the private market, bad." Like, I just don’t agree. Anyway, we should move on. We’re not going to agree on this. We will move on. I

Vic: think we both agree that AI is an existential risk to humanity. [00:56:00] Where we differ is how to protect ourselves.

Marcus: Yes, we differ on that. Okay. So we want to just give a couple of, um, related stories to this, right? So,

Vic: that happened Monday. Yep. And then on Tuesday, Wednesday, Kamala Harris went to the UK, and it was the G7 countries, and then they had China; there were a bunch of countries there, which was great, and they’re trying to, um, lead the global countries in some version of AI regulation.

Marcus: I think it’s funny that you think when all the countries come together, it’s okay, but the United States government doing it, no, no. I still think

Vic: they’re gonna

Marcus: it’s

Vic: no

Marcus: different. Okay

Vic: You just said it was great. Well, what I was meaning is, it’s great that they didn’t say G7 only, and they invited China, okay, because I think if you don’t invite China... I didn’t see India there. [00:57:00] Hopefully India was invited, but I think what was great is they invited many people. I don’t know how the invitations went, but it wasn’t only, like, the UK and the U.S.

That’s what I thought was great. I don’t think adding in a bunch of other politicians is going to make it better. So I don’t think the product of this summit, the global AI safety summit, is going to be any better than the U.S. one. The U.S. is trying to take a leadership role, which is positive, to include everyone.

Marcus: Uh, the headline from the Wall Street Journal story about it is "Take Science Fiction Seriously: World Leaders Sound Alarm on AI." So, um, yeah, look, I don’t want to continue the debate; we just went through these points. Um, and then Google Brain’s co-founder says big tech companies are inflating fears about the risks of AI wiping out humanity because they want to dominate the market.

I agree with this. I think this is, look, I mean... You can’t agree with [00:58:00] this and not the regulatory capture. You can agree with it. You can. I mean, I think that’s the point: we’re supposed to be able to hold two tough things in our head at the same time. Like, it’s not like one thing is true and the other thing is not true.

That’s like the All-In show. We’re not the All-In show. You know, I can say both of these things are true. I can say the United States government absolutely has to do something on the regulatory front, and at the same time, the United States government is, you know, largely influenced, because we’ve allowed, you know, private interests through unlimited money to influence elections and lobbying.

There’s no question that Sam Altman went up there, you know, and said, hey, we really got to watch out for this, so that they could be the drafters of the regulation, so that only they could pay for the barriers. Like, I get it. I know that it’s suboptimal. I mean, I very much understand, and I get the dilemma.

I hope that it doesn’t land that way, you know? I hope that the government can see that that is where this is headed, and see how bad that has been in the current big tech oligopoly, and is moving towards a much more, you know, diverse model as it pertains to how AI will be created in the free market. But this is a risk.

I get that. I get that. I get that.

Vic: Yeah. So, I mean, if, um, if they formed a group of, I don’t know what it would be, 20 AI researchers and 20, uh, big tech for-profit researchers, and got together some kind of think tank, commission, something, that would have a chance. But I think it’s going to end up being mostly information from big tech.

And well, they’re going to have, they’re going to have, you know, not [01:00:00]

Marcus: total information. Big tech are the leaders on AI right now. I mean, you know, they’re biased. They’re super biased.

Vic: Okay. Well, we agree on that. Yeah. Yeah.

Marcus: No, no, no. Look, I’m glad I don’t have to work on this problem, because I recognize it’s a really, really difficult problem.

It’s a really difficult problem. And I do not trust these entities to regulate themselves at all. But I also don’t believe the regulation will be devoid of their influence. I very much am, you know, aware that that’s going to happen. Yeah. So we’ll post

Vic: this. This is a good article. We’ll post this in the show notes.

Yeah. It’s from Business Insider.

Marcus: Right. Uh, and then the final thing on this front is Bill Gurley, uh, who’s the founding partner, right, of Benchmark, um, you know, super VC fund, uh, and sort of one of the auxiliary members of the All-In [01:01:00] show. He’s not one of the core four, but, you know, he’s on there somewhat regularly.

Uh, he did a presentation at the most recent All-In Summit. It was an unbelievable presentation.

Vic: Yeah. Watch, watch this. Regulatory capture: defining it, explaining it, illustrating it with examples. It defends Vic’s

Marcus: Uh, point very, very well with fantastic examples, and it’s a real concern, and it’s something that we need to be aware of.

And as this AI regulation is happening, to the degree that you can be involved, you know, if you’ve got a network, if you’ve got influence, you need to be pushing for more voices,

Vic: especially in our audience, largely healthcare. More voices in how healthcare should be regulated, to get to the right answer, would be great.

Marcus: Yeah, because the regulation is

Vic: coming just

Marcus: like I was right. Yes, about this. You were right about that. It’s going to keep coming; it’s coming. It’s going to keep coming. And so what we have to do is we have to lean in, use our influence, and, you know, organize behind, I would [01:02:00] imagine, groups like Engine, uh, that, you know, do a lot of lobbying on behalf of the startup industry and the VC industry.

I would imagine that might be a good place for this, and I’ll put a link to them. I have no idea if they’re even working on this, but I bet they are. I’m sure they are working on this. But anyway, this presentation is 36 minutes. Uh, not only is it informative, it’s super entertaining.

I was very impressed. As someone who does keynote speaking, he crushed it on this. Uh, so definitely it’s worth the watch. It’s got 442,000 views right now on YouTube, so, you know, you’ll be in good company. All right. And then the final thing, which is a really positive note and exactly why we cannot stop the progress on AI.

And, by the way, AlphaFold had a next-generation breakthrough, uh, in their ability to model, uh, proteins. Yeah. So

Vic: proteins fold in a very complicated [01:03:00] 3D way. Most of our audience probably knows this better than I know it, but the function of a protein is very tied to the shape and configuration of the protein. And AlphaFold, it’s their third version of this, but they have artificial intelligence technology that’s designed to model how a protein would fold based on its genetic makeup.

And what’s interesting about this one is they’ve extended it to something called ligands, and to other things like RNA around the protein. Ligands are, I don’t fully understand it, but they somehow attach to the protein. And I think there’s a lot of disease treatment, new drug discovery, that is sort of at that interface of where in the protein the ligand is attaching, and getting the shape right.

And they have modeled, I think, every possible protein and ligand that exists,

Marcus: [01:04:00] and they’ve validated the efficacy of it, because they’ve matched it to previous models, experimental, uh, you know, extracted models, and yeah, where

Vic: they had the actual shape.

Marcus: Um, so, I mean, what’s incredible about this is you no longer need the actual protein. Like, they’ve figured out how to generate a model that is, you know...

Vic: And you can look at sort of disrupting, uh, disease progression in silico. You can be simulating and modeling thousands, millions of potential candidates on computers, and then you find the one, two, ten that have the best potential, and then you take it to the lab, right? And the acceleration of treatments for human disease is going to be incredible.

They’ve already got some examples; we’ll link to it. And so this is AI, and it is very [01:05:00] powerful, and it’s moving very quickly. And one of the things that makes me concerned is, this is just data. But like, modeling every protein and every ligand, most people will use it to try to find therapies and cures and solutions.

You could also try to find a way to interrupt the positive way a protein proceeds, and I don’t think we should stop this technology from existing. But of course, we need to try to keep people from using it for bad. So, I mean, it’s both ends. We don’t want to stop the innovation and the advancement, and we, of course, want to keep bad actors from using it.

But with the CRISPR-Cas9 stuff, you can sort of replace pieces of DNA, and you now know what the shape is going to be. It’s really powerful stuff.

Marcus: Yeah. And they’ve got this picture here of their [01:06:00] model, um, you know, against, uh, an examined model of DNA and the ligands. And it’s just, it’s just unbelievable.

Like how, how they’ve nailed it. They’re right on top of the old model now. And it’s just computing power, computing power and data and, and a transformer model. I mean, and

Vic: it’s a, and it’s a, like a foundational tool that researchers around the globe are going to take and then uploading this to a

Marcus: database.

And they said they’ve already got, I mean, how many, how many users are on this thing? I mean, a lot of people have already been able to access it: 1.4 million users in over 190 countries have access to the open-source AlphaFold database.

Vic: Yeah. So with all of this controversy and us debating, I think it’s good that we’re ending the show here.

There are incredible advances, and it’s going to be net, it’s going to be great for humankind, but it’s super powerful and we need to try to use it for good.

Marcus: Yes. [01:07:00] I think that’s a good note to end on. Yes. Our first proper debate, man. Um, yeah, you’re, you’re right. As usual, but, uh, no, no, no, no, no. You’ve got, look, you’ve got important points and I think, uh, they need to be taken seriously.

Um, our government is unfortunately in a very compromised position right now, and it’s been a series of events that have sort of led us to this point of compromise, and we should have a healthy skepticism of their abilities. I agree with that.

And, and, you know, as, as citizens that have a voice and have influence, you know, this is, we’re at that point in our lives where, you know, if we, if we believe strongly in something like this, you know, you’re, you’re in your early fifties. I’m in my late forties. It’s like, we kind of have a responsibility to at least talk about it on this show.

But probably more importantly, you know, talk to our elected officials that we [01:08:00] know and say, Hey, here are the things we’re concerned about. You know, you and I both know the sitting United States senators for the state of Tennessee. We probably should be reaching out and at least sending them an email and saying, Hey, listen, we want to contribute.

Yeah, yeah, because it’s too important. This is actually way more important than most people understand, because we’ve got these really pressing, you know, war issues around the world, and we’ve got the trial of Donald Trump, and we’ve got the House nonsense going on. But this is like the thing that one day will just pop up on the entire world, and, um, it will change everything.

Vic: Yeah, and I think we don’t get it

Marcus: right.

Vic: It’s easy for me to throw stones, but I think it is, it’s more important, and I will try to be constructive and add to the, you know, add to the debate, try to contribute to it, because I think that’s what we all should try to do. Yep. All right, man.

Thanks for putting together another

Marcus: [01:09:00] show