Morning. It is morning, yes. Okay.
I want to start this morning by telling you about Michael Chorost. He was born with a hearing impairment and had hearing aids from birth. Voices were sometimes difficult to understand. Music in particular was difficult to enjoy; it was hard for him to hear it, hard for his brain to process it. Except for one song, and that is Ravel's Boléro. Boléro is one of the world's most well-known classical pieces. It takes about 17 minutes to play all the way through, if you obey Ravel's instructions. It's played somewhere in the world, statistically, every 15 minutes. So do the math.
It's always playing somewhere; odds are, right? And so he heard it first when he was 15. It'll get louder. And he fell in love with it, right? It was his passion, it was his source of truth; it was basically his authentic identity, his authentic humanity. But in 2001, his hearing started to fade. Over the course of four hours, cars started getting quieter, conversations started to drop off, and he lost all of his hearing, including his beloved Boléro.
Now, he was an informed journalist, so he went and got cochlear implants. These are amazing devices that bypass the damaged parts of the ear and send auditory signals directly to your brain.
Now, once he had these cochlear implants in, he did what anyone in his position would do: he turned on Boléro, and this is what he heard.
Now, this is very common in cochlear implant patients, right? The implant restores your ability to hear, but the signals are so disrupted that your brain can't process them accurately. It can't find that source of truth. This morning we'll talk about how identity and authenticity, authentic humanity (I'm going to use those terms interchangeably, so just roll with it), are disrupted. We'll talk about defining what that is, how we define authentic humanity. And we'll briefly touch on recovering it.
If you're here for solid, concrete answers, you've probably come to the wrong place. I'm here more to make us all think than to say, this is what you should go and do.
First off: authentic signals disrupted. If you were listening in a completely soundproof room to any sound, but Boléro in particular, it would be really easy to know where the source of the sound was, right?
If you've ever been in one, they're kind of creepy. But the upshot is: it's over there.
Well, not you, sir; I mean the sound coming from over here, right? But we don't live in a soundproof room, do we? We live in a room where there are echoes: echoes off the floor, off the ceiling, off my hat, off someone's head, bouncing off everything. And what happens is your brain takes the first wave of that sound and says, oh, that's the true source, that's the primary source, and filters everything else out as an echo. Your brain does this automatically. In Michael Chorost's brain, however, those signals had been so disrupted that it was as if his brain wasn't able to do that. And so he wound up in a very, very messy sound environment where he couldn't recover his Boléro. As you can see, it would be a really painful world to live in.
Now, identity, as I said, is in a similar place. Where he was trying to find the authentic source of Boléro, all the time we are constructing our authentic humanity, our authentic identity, in various ways.
Now, this could easily be some kind of identity slide where you take in attributes and then make decisions about them. But this is really what we're doing, right? You form an authentic identity, and then someone else is listening; someone else consumes that.
Now, with the rise of generative AI, this authentic signal is being disrupted. Maybe it's disinformation about who a person is or what they've said or done; I have a friend who has spoken recently at conferences, and now he's being misquoted online by gen AI, for instance. Or maybe it's unreliable biometric signals. We've all seen the voice copying, the image copying; signals that we thought were reliable before no longer are.
And then there are full-on agents. We heard Ian talk about counselors the other day.
And so we're going to have to think about how we deal with interacting not with a person but with a person's representative. And oddly enough, that's what Ravel's Boléro is all about. Ravel grew up, obviously, with two parents. His mother grew up in the Pyrenees, and she would sing him Spanish folk songs at night. On the other side, his father worked in a factory.
In fact, Ravel always wanted the piece to be performed in the shadow of a factory. And if you listen to Boléro (I'll play a little bit here in a minute), you hear this tension between automation, the factory, innovation, technology, and authentic humanity, in combat. Listen to the struggle as it develops.
So the piece starts with a single snare drum. And for the entire 17 minutes, this poor soul is playing the same rhythm. It is the most boring musical part in the entire world.
There's actually a YouTube video, a French film shot from the drummer's perspective, which is hilarious. But you can hear it, right? You have the drum building up the same rhythm, the automatic machines coming, and then you have this melody that starts with a flute and is then taken up by a good portion of the orchestra. I didn't play the end, because Boléro has an opinion on which wins out, and I'm not going to tell you how it ends. I encourage you, on the flight home or sometime later, to go listen to it. It's pretty spectacular. But the point is that the struggle is always going on.
And so if there's a struggle in Boléro, and a struggle for Chorost, and we're facing a struggle to keep our authentic humanity, it might be helpful to suggest a definition.
Thankfully, in the security and identity industry, we already kind of have a definition for authentication, right? Something we have, something we know, something we are. This is a convenient device to talk about how people form their identity, who they authentically are. By the way, I'm going to abuse some philosophical terms; I know I'm doing it, we can talk about it later, just roll with it.
Something you have: materialism, right? Maybe you define your identity by what you own, what you have, what you've created; it's now your property. Or maybe you create your identity by something you know, maybe by speaking at a conference like this one; it makes you feel like that is your identity. Or something you are, something more postmodern: this is how I define myself, and this is how I'm going to project myself to the other. Whatever it is, it's usually, obviously, a combination.
Gen AI is disrupting that. I grew up with something called Bloom's Taxonomy.
And whether it's right or wrong, it gives a good scale of what they called, when I was growing up, lower-order thinking and higher-order thinking. And you can see that gen AI is going up this scale, right? From straight recall (tell me what this quote was, or what this number was), up to applying, up to analyzing, up to creating. And as it does that, it challenges each one of these ways of defining identity. Maybe it's what you have under attack; you can really see this as theft, by the way. Think of data provenance.
Think of the New York Times suing OpenAI because OpenAI came and took everything they had and put it into their model. Or maybe it's intellectualism under attack, what you know; that's disinformation, right? If you haven't heard Pablo and Daniella speak, they're speaking at 1:30 in this room today.
They're actually doing things to counteract that. People are also working on the data provenance issues. So I'm not really interested in talking too much about those, because there are people here who know a lot more about them and are doing things about them.
I want to talk more about self-representation, and I think the way that's under attack is through what I would call surrogates. Any time someone uses a word like that, you need them to define it, so here's a definition: one or something appointed to act in the place of another. As I was thinking through this, I found three different areas: biometric spoofing; using gen AI as a tool to create things; and then those agents I talked about before, the counselors or even the recreations I'll talk about in just a minute. First, biometric spoofing.
Again, there's been a lot published on this. People know it's a problem, people are working on it, so we'll just skip over that. Let's go to tools and creators.
Now, how many of you have used ChatGPT or OpenAI or LLMs this week to create text? Quite a few of you, right? By the way, this is not a shaming talk. I have no problem with using these tools; I just want us to think about what the implications are for our authentic humanity and to speak about them accurately. Why? Because invariably, you start by using it for something we're all okay with. We're all okay with AI being used for spell check, right? Spelling is challenging for a lot of people.
As you move up, though, all the way to maybe a finished screenplay, there's some decision point you cross where you say, I created this, right?
That's a phrase you hear a lot.
Now, they usually say, I created this with ChatGPT, or with OpenAI. But as soon as they say "I created it," they have bound their identity to that creation, right? They have claimed to have created it. And that's kind of a problem in my mind, because I don't think it's necessarily authentic. I would argue that, especially as you rise up the spectrum, we need to describe AI as the creator and us as the editor. More on that in a bit. It's never this simple, right? It's not like, well, here's my line, clean cut. It's always messier.
Think of another factor, like data provenance. How do you decide, not just what the AI is doing, but where the data the foundation model was trained on came from? Do you even know where it came from?
Do you have permission to reuse things?
Now, to be fair, we do this in real life, right? We borrow things and synthesize things all the time, but this complicates our choice.
And again, we're making this call, human as creator or AI as creator, and that's how we talk about it. And you should be saying to yourself, Mike, this line is completely arbitrary.
And I say, yeah, it totally is. That's part of the issue, right? People don't think it through. They do what's convenient and what's easy, and that makes them slightly inauthentic.
Another axis of consideration: I really couldn't think of a good word for this, so I'm calling it importance.
Think of the stakes, right?
Using ChatGPT to write an email? All right, fine. But if I use ChatGPT to write a report for my boss, or for my workplace, or for a customer, that's a little higher stakes. So that also plays into it. Fascinatingly, as we get used to these tools over time, not only are we deciding who the creator is, but that line is shifting, right? I'm old. I remember back before we had mobile phones that could do everything; I had about 40 phone numbers memorized. I don't anymore, because we all grew okay with ceding that function to our phones. It's a similar process with AI and ChatGPT.
So really, what it comes down to is that we need to think through how we think about creation, and ultimately what that means for identity. Are we creating and AI is editing, or is AI creating and we're editing? Like I said, I don't have hard and fast answers; I just want to force discussion. But I do have an opinion, right? My personal opinion is that if you use gen AI to create something, you're not the author. You are at best the editor, and more likely you're the patron. Your identity is slowly becoming a little bit more and more detached from the creation.
Think of great works of art. Think of da Vinci being commissioned by the Medicis, right? No one says the Medicis were so creative.
No, people say the Medicis had a lot of money, right? It was da Vinci who got the credit. Not only do we need to talk about these creations accurately, but there's a storm coming, to some degree. Recent research has shown that, over time, the content being fed into these models is generated content from other models.
When you do that, what happens is you lose the variety; you lose a whole spectrum of the data. The easiest way to see this: there was an original model generating just digits, right?
But as its own output was fed back in, generation after generation, the digits all reverted back toward a zero-like image, because that is the most common thing. And so as we build these models, people are beginning to realize that we need actual, active human input.
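To make that feedback loop concrete, here is a minimal toy sketch in Python. It is a hypothetical illustration under simplified assumptions (a one-dimensional Gaussian standing in for an image model), not the digit experiment from the research: each generation fits a simple model to its data and then trains the next generation only on that model's own samples, and the spread of the data collapses toward the most common value.

```python
import numpy as np

# Toy illustration of "model collapse": each generation of the model is
# trained only on samples produced by the previous generation.
rng = np.random.default_rng(0)

# Generation 0: "real" data with plenty of variety.
data = rng.normal(loc=0.0, scale=1.0, size=100)

for gen in range(1, 501):
    # Fit a very simple model (just a Gaussian) to the current data set.
    mu, sigma = data.mean(), data.std()
    # Train the next generation exclusively on the model's own output.
    data = rng.normal(loc=mu, scale=sigma, size=100)
    if gen % 100 == 0:
        print(f"generation {gen}: spread (std) = {sigma:.3f}")

# Over the generations the spread tends to shrink toward zero: rare values
# vanish and everything drifts toward the most common case, much like the
# digits collapsing toward a zero-ish blur.
```

The same dynamic, losing the tails and converging on the average, is what the talk is pointing at when generated content becomes the training data.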
My partner is a longtime editor, and she can spot immediately when something has been generated. Not because it's bad or anything like that; it's just average. It's not creative, it's not bad, it's fine, right? And so that's another way we're starting to lose our authenticity as humans: we're abdicating, we're ceding this function, and it's watering down what we have. So, we talked about, well, we didn't actually talk about biometric spoofing, but there's enough content on that already.
We went into a discussion about tools and creators and how we talk about them: the difference between saying "I created this with AI" and saying "I prompted AI to create this, and then I edited it, because, you know, the person had eight fingers and I had to fix that."
Now let's talk about counselors and recreations. This is something that is actually interacting with our social fabric. There's a Gallup poll that found about a quarter of people worldwide describe themselves as lonely. A New York Times journalist quoted this, and so he went out and made, as one does, a bunch of AI friends, AI companions. You'll note that they're a little stilted, especially the guy on the top right, who I assume wants to talk about lifting heavy things. But the point, as he noted in the article, is that this interaction is not genuine, not authentic.
Why?
If you don't like it, you can regenerate it, you can turn it off, you can box it in. Real interaction with other humans, authentic human interaction, requires a give and take; it's dynamic.
In fact, he was quoted as saying that what this did was tell him, oh, this is what I value about my real-world, flesh-and-blood friends. Of course it did; this is a false interaction. And so we need to think about the implications of people using these to address their loneliness, to address their needs. There's another scenario that I've talked about a lot recently; I think I mentioned it yesterday on a panel. A couple of years ago, there was a gentleman whose fiancée had a chronic liver illness, and she passed away. He couldn't get over her death.
And so he heard about this new thing called GPT-3.
It was pretty new. He bought a thousand credits, spent some money, fed it her tweets and her texts, and talked with it. On one hand, it was great; it helped him get through his mourning period. It didn't tell him to move on or anything like that, but it helped him process his grief. On the other hand, the thought was that when the thousand credits ran out, it would say "the matrix has been destroyed" in big red letters, which is not ideal. He'd already been through her death once; he didn't need it again.
So what he did was talk to her for a while, but he kept his eye on the meter. The way it was set up, if he paid more money the personality would change, so he couldn't just start over. So she's still somewhere out there.
That limitation is probably a good thing, because otherwise, what's the consequence? The consequence is that we could keep our loved ones in a box in perpetuity, right? We need to think about these issues as we face them.
Finally, some thoughts about recovering authentic identity. First, thinking individually: like I said before, how you think about it and how you talk about it seems unimportant, but that builds your foundation. It percolates up into everything you do, professionally and socially. The other thought: there's a working group that Dean Sachs is starting up to think about the implications of gen AI for the digital estate. How is your digital estate taken care of after you're gone?
How do we build standards to help people pass that on, to help people have these conversations and think about, in this case, gen AI counselors and gen AI agents, and what that means for your worldview?
It's a great melding, an intersection of digital identity and real-world human authenticity that we can pursue. Because what's important is that even as we use this technology, we need to emulate Michael Chorost, who remembered what Boléro sounded like. With cochlear implant patients, what happens, amazingly, is that everything comes back; the brain takes those signals and rewires them into the truth.
You remember, for Chorost, after a couple of days his own voice went back to normal. After a couple of weeks, his family's voices went back to normal. And music is always the last thing to come back, and often it doesn't. But because he knew Boléro, he remembered what his true humanity, his true self, was. Eventually he recovered Boléro. We need to remember what authentic humanity is, even as we embrace innovation and think through these issues. Thank you. Thank you.
Thank you, Mike. Thanks a lot.