Tech and Innovation for Social Good with Lucia Gallardo

Published May 9, 2022

How do we innovate with technology to solve a nation’s crisis? Can we redirect tech toward solving a social cause? Lucia Gallardo tells us how!

Our most recent guest on Slaves to the Algo says, “When I discovered technology, it was always going to be a tool. So, when I started my company, it was all about how do we take the DNA of these technologies and build solutions around them in a way that democratizes not just accessing this technology, but in a way that is highly inclusive of the access and the benefits, but also of the participation.” 

Lucia Gallardo is a Honduran serial entrepreneur and the Founder & CEO of Emerge, a company dedicated to developing emerging technological solutions with a social impact. Emerge is a humanitarian technology company that enables the more efficient, more humane, and more transparent movement of people, goods, and data around the world. In 2020, Lucia was named one of MIT Technology Review’s Innovators Under 35. In her spare time, she sits on the boards of various non-profits like Crypto Kids Camp and Rainforest Partnership, tech organizations like Penta Network and The Caribbean Blockchain Alliance, and WE Global Studio, which supports women entrepreneurs across the world. She is Acquisition International’s North American Female Blockchain CEO of 2019, and in 2020 she was nominated for Royal Bank of Canada’s Entrepreneur of the Year award and Future of Good’s 21 Founders to Watch.

How invested are you in conserving nature? Did you know you could help save rainforests through NFTs? Check out the full conversation between Suresh and Lucia below.

About Slaves to the Algo  

Whether we know it or not, like it or not, our lives have been taken over by algorithms. Join two-time entrepreneur and AI evangelist Suresh Shankar, as he talks to leading experts in various fields to understand how they are using or being used by algorithms in their personal and professional lives. Each episode highlights how businesses can leverage the power of data in their strategy to stay relevant in this new age of AI. Slaves to the Algo is brought to you by Crayon Data, a Singapore-based AI and big-data startup.  

Suresh Shankar is the founder and CEO of Crayon Data, a leading AI and big data start-up based in Singapore. Crayon Data’s flagship platform, maya.ai, is the AI platform powering the age of relevance.  

How to listen to our podcast  

Apple Podcasts  
Spotify  

Google Podcasts  
YouTube  

Full transcript of the episode below: 

Suresh Shankar   

Hello viewers and listeners. Welcome back to another episode of Slaves to the Algo. I’m Suresh Shankar, founder and CEO of Crayon Data, an AI and big data startup from Singapore, and podcaster and host of Slaves to the Algo. Slaves to the Algo is my attempt to demystify the age of data and the algorithm, sharing learnings from myself and from other professionals to understand how they are using or being used by algorithms and data in their personal and professional lives. As slaves to the algo, we don’t attempt to portray our future as either dystopian or utopian; it is just what it is going to be. What we try to do is bring the use of data and algorithms more into our conscious thinking selves. In this particular episode, we are focused on a separate sub-theme of Slaves to the Algo: at Crayon we are constantly inspired by the stories and professional achievements of women, and we chose to break the bias. The tech industry has been notoriously challenged when it comes to women’s representation. A 2020 study found that only 28.8% of the tech workforce is women. There has been a steady increase in the past few years, but clearly it’s not fast enough. At this pace, it’s going to take more than a decade for women to gain equal representation in the industry. And while we as an industry work to make this a reality, the industry is constantly looking for role models who are already blazing a trail for other women, which is why we’re doing a mini-series of episodes featuring women in tech, those who are reinventing the technology landscape as we speak. And today I’m particularly thrilled to welcome a social entrepreneur, a lady whose work focuses on creating systemic inclusion. Lucia Gallardo is a Honduran entrepreneur and the founder and CEO of Emerge, a company dedicated to developing technological solutions for social impact. In 2020, Lucia was named one of MIT Technology Review’s Innovators Under 35. In her spare time she sits on the boards of various nonprofits like Crypto Kids Camp and the Rainforest Partnership, tech organizations like the Penta Network and the Caribbean Blockchain Alliance, and WE Global Studio, which supports women entrepreneurs across the world. She was named the North American Female Blockchain CEO of 2019, and nominated for Royal Bank of Canada’s Entrepreneur of the Year Award and Future of Good’s 21 Founders to Watch. That’s a lot of stuff that Lucia gets up to, but she does have a day job, and that day job is to be CEO of Emerge. What a lovely day job, because Emerge calls itself, in its own words, a humanitarian technology company that enables the more efficient, more humane and more transparent movement of people, goods and data around the world. I couldn’t say that better. Welcome to the show, Lucia.

Lucia Gallardo 

Thank you. Thank you very much. I really appreciate the focus that you’re giving the show and you know, I think there’s a lot of really thrilling work. I think there’s a lot of women doing interesting work in this space, but very often they don’t get the mic. So I’m super excited that that’s what you’re featuring for this series. So thank you for having me. I’m really happy to be here. 

Suresh Shankar 

And it’s not just about women not getting the mic. I think it’s also the fact that social entrepreneurship doesn’t get the mic. So we are actually hoping to kill two birds with one stone in this show. But I always like to start the show in a slightly more personal way, right? I mean, you know, we’re all professionals, we use technology, we use data, and we are affected by the developments in data and AI. But at the end of the day, we’re also human beings, and we all understand that data impacts our lives positively and negatively in many different ways. Could you share some examples where you thought, hey, wow, the data really made my life fantastic and insightful, or possibly miserable?

Lucia Gallardo   

Yes, miserable more often, because I play with it so much. I mean, there are very interesting use cases. I’m a particular fan of very simple regressions that I think just make the right relationships between datasets. I’m also very fond of anything that accounts for randomness, and I’m a big fan of anything that optimizes image or object recognition. Those are the areas where I live. In the case of object and image recognition, we were piloting a solution once that related to truth telling. Obviously, when you take a person’s testimony of something they’ve experienced in life, it can be really hard to vet what’s true and what’s not, because people experience things in a particular way. And there’s something about the way people talk about fake news that makes them want to solve for truth. But to me, it’s impossible to solve for truth, because it’s just the way people see the world, and it’s informed by who they are as people, the experiences they’ve had, the way they view the world, their opinions, et cetera. What you can solve for is the surrounding context of truth. So, does it make sense that this person would say this? In the pilot use case we did, which was related to the migrant caravan: did their GPS mark that they actually were walking across Mexico at the time? Did they receive a service anywhere in Mexico alongside the caravan? Were they following the same cities as the caravan? Are they from the country of origin they’re claiming? Does their accent sound like that? Do they have any imagery to support that? Then you start piecing all of these data points together, and you start rendering outcomes of saying, yes, this totally makes sense, it adds up. We were experimenting with that because we understand that the current approach to fake news cannot be solved by looking for truth, especially in cases of personal testimony. And when I think about whether contextual evidence can support statements, you can look at the applicability of that in many different situations: in sexual assault, in cases of whistleblowing, in cases of humanitarian displacement, saying, okay, roughly everything this refugee is saying actually makes sense, we want to resettle them faster. So when you start looking at solving a problem from an approach that does not try to solve for something that is almost impossible, if not totally impossible, to solve for, I think we open ourselves up to setting precedents that could solve other problems in better ways. So I’m particularly interested in algorithms that can piece together datasets from very wide types of data and sources, as well as anything that supports contextual evidence. I think there’s a lot that can be done with that. And then I love, and I guess my wallet hates, that social media algorithms are really effective on Instagram at showing me what it is that I don’t have but want.
My Instagram is actually full of travel destinations and restaurants, to be frank; it’s become quite problematic. So if I’m being lighthearted about it, I guess I have a love-hate relationship with those algorithms.
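A rough way to picture the contextual-evidence approach Lucia describes is as a weighted combination of independent signals around a testimony. The sketch below is only an illustration; the signal names, weights, and scoring logic are hypothetical, not Emerge’s actual pilot system.

```python
# Minimal sketch (hypothetical, not Emerge's system): combining independent
# contextual signals about a testimony into a single plausibility score.
from dataclasses import dataclass

@dataclass
class Signal:
    name: str
    weight: float    # relative importance of this piece of context
    supported: bool  # does the available evidence corroborate the claim?

def plausibility(signals: list[Signal]) -> float:
    """Weighted share of contextual signals that corroborate the testimony."""
    total = sum(s.weight for s in signals)
    if total == 0:
        return 0.0
    return sum(s.weight for s in signals if s.supported) / total

signals = [
    Signal("gps_track_matches_caravan_route", 0.30, True),
    Signal("received_service_along_route", 0.20, True),
    Signal("accent_matches_claimed_origin", 0.20, True),
    Signal("supporting_imagery", 0.15, False),
    Signal("timeline_consistent", 0.15, True),
]

print(f"Contextual plausibility: {plausibility(signals):.2f}")  # e.g. 0.85
```

The point is not a verdict on truth itself, but a transparent measure of how much of the surrounding context adds up.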

Suresh Shankar   

You know, it’s refreshing to hear the first part of what you just said, because everybody who comes on says Instagram does that, Netflix does this, Amazon does this, something does that. And the first part of what you just said is about how data and algorithms can really be used on so many different problems: fake news, refugee claims, sexual assault, so many different ways in which data can actually change the world. But I want to go back a little bit in time, Lucia. One of the things that you said is that Emerge has been inevitable since you were 12. When I was 12, I knew nothing except that I wanted to play cricket. Okay, so

Lucia Gallardo  

Eight, since I was eight.

Suresh Shankar   

Eight, okay, that’s even better. There’s obviously something that gets you going, something that made you the entrepreneur, some defining experiences. So how did you become a social entrepreneur? What shaped that journey?

Lucia Gallardo   

Yeah, you know, I think technology is a very interesting thing because it’s just very neutral, super neutral. You can do great things with it, you can do terrible things with it; it’s always something that we inject our intentions and our purpose into. And sometimes we have good purposes that turn out bad, that has happened before, we’ve seen that happen, and vice versa. So the reason I make the joke that I started Emerge when I was eight is that that’s where my intention was formed. In 1998, and I’m aging myself here, my home country, Honduras, where I was born and raised and where I was living, went through a very intense natural disaster, a hurricane. It swept the country, it destroyed hundreds of thousands of homes, and I think they calculated it set our economy back 150 years. So it was very devastating to so many people. I was very fortunate; my house lost electricity for about two weeks and there was a little bit of flooding, but that was mostly what I went through. At the time, my parents were organizing donations, and they would bring my siblings and me along and make us help. So we would hand out water and food and clothing and anything else that we needed to. And there’s something about that: you can be 8 and not understand the world, not understand the economics of losing your home in a country that has 62% of the population living under the poverty line, living under $2 a day. You don’t understand that as a child. But when you see desperation, when you see the intensity of human need in a crisis, I think you never really forget that. And I just didn’t, I couldn’t. So I became very interested in volunteering, and in this idea of generosity and compassion, and my parents very much fostered that. And then when I was 12, which is the other big memory, I was in a science class and my teacher, Mrs. Heaton, gave us an assignment. She said, you know, anything we’ve learned this year, just take it and apply it to Honduras; show us how it’s a real use case, which is what this podcast is doing. And you’ll see the impact of it, because when I did apply it, I chose the theme of water. So obviously, between a hurricane and this theme of water, it’s been a very consistent issue in my life that’s led me into very good places. So I went and took a tour of a water purifying plant, because we had learned about water purification. And don’t ask me a single word about water purification, because I really don’t remember. But on the way down from the plant, there was this neighborhood right next to the largest water plant in the city. And very naive 12-year-old me said, let’s add a human element to this project. So we get out of the car, we knock on the first door, and we ask if we can interview them about what it’s like to live next to the plant. Because I was thinking of a human element like, it’s noisy, it’s inconvenient because there are trucks coming in and out, et cetera. And she said that their community didn’t actually have running water. They live next to the largest water plant in the city, and they just didn’t have it.
They would collect their water in different ways: either from the rain, or from a well 90 minutes away, or they would buy it when they could, or every two weeks this water plant would donate to them whatever the city didn’t use. So every two weeks they interacted with the plant to receive that donation. And I remember very clearly, I can still feel it when I think about this memory, the progression of emotion that I went through. At first I was very sympathetic and sad, and that was the wrong approach. They didn’t need my sympathy; what they needed was someone who wanted to understand why this issue happens in the first place. What kind of society creates intergenerational poverty and cycles that are super hard to break out of? And why is there such deep inequity in access to basic needs, like water, or shelter, or electricity? So it became kind of inevitable that I would work in the social impact space. Obviously, I didn’t know how I was going to do it, and for many years I thought it was going to be through public sector work, but then I did do public sector work and obviously I’m not there anymore.

But I think for me, it was just inevitable. And then when I discovered technology, to me it was always a tool. It has never been an interest in, oh, I think this technology is going to save mankind, because I am very cognizant of the fact that it’s always changing and always needs to be improved; that’s the way we need to look at technology. So when I started my company, it was really about this idea of saying, we have all of these tools at our disposal, they’re changing exponentially, and they’re creating new possibilities in the way that they’re changing. So how do we take the DNA of these technologies and build solutions around them, but in a way that democratizes not just accessing this technology, so that it’s not just big corporates that can afford to play with them and receive the benefits of them, but in a way that is highly inclusive of the access and the benefits, and also of the participation? To me, it’s very important to use teams that are very diverse around the world, very diverse in terms of life experiences and educational achievement. So it was intended to be a company that brings about, or sets precedents in, technological justice as a concept, in saying we want everybody else to be benefiting from this technology, to be participants in how we design the next technological waves, and also to be able to feel like it’s actually helping them in very real ways, addressing problems that need to be solved, because we’re living in a world that is, unfortunately, statistically getting worse for many.

Suresh Shankar   

First of all, thank you. It’s such a wonderful story, right? I mean, many of us do go through difficult moments in life, but not too many people have used them as a foundation to actually choose a path that’s very different. The fact that you did it, hats off to you. The social part I understand; where’s the tech part? How did that tech part get triggered in you?

Lucia Gallardo   

I was so done with the public sector, I just felt I was exacerbating issues. I had been working in diplomatic affairs, so I was dealing a lot with international relations, international trade, et cetera, so I had a lot of experience with international markets. And at one point, I was living in Montreal and I received two job offers, and I was very surprised by them. The first one was from an artificial intelligence company that was doing predictive price modeling for airfare, called Hopper. It’s now a unicorn in Canada, I think, but at the time there were fewer than 20 people working there; it was a very tiny company. And they said something like, you know, we want to grow out of Canada, we just don’t know the international markets very well. And I’m like, great, because I don’t know AI very well. The other job offer was from a nonprofit organization that was taking hardware engineers and neuroscientists and making a community out of them, putting them together in rooms so they could just see what would happen. Obviously we now have this wave of neurotech coming out of Montreal, and this was a big reason; they were a fire starter in that. And they said, we currently have these little chapters in Boston, Montreal, Toronto, and San Francisco, but we want to be global. And I was like, cool, I have no idea what you’re doing or why, but I love it, let’s take it global. Startup culture is very friendly to flexible hours, so I took both job offers, and I just dived so deep once I was in, I was so curious. I think I describe myself as an intellectual omnivore, because I just want to know how everything works. So I got very curious about the technology, and the more I learned about it, the more I wanted to know. I went down these rabbit holes of technology, and I immediately started making connections between possibilities in neurotech and possibilities in AI. Then I started looking at other technologies, so I started learning about IoT and blockchain, and very slowly my mind just puzzled its way into understanding that, if all of these are tools, why aren’t they directed at the mission that I care about? And so that’s sort of how it happened.

Suresh Shankar   

And that’s a lovely backstory. You know, Steve Jobs talks about connecting the dots backwards, and you just connected the dots of your life beautifully backward, in a lovely way, to tell us how you reached here. But Lucia, coming to where you are now: you’re a social tech entrepreneur, and I think that’s defined as people who are not just saying I’ll build cool technology and yet another app to sell us something more, though we all love that part of it, but something that’s actually about doing good. Could you share with us some examples, from your own work or from social tech entrepreneurs you meet, that are making a considerable impact with innovative use of data, of AI, of tech?

Lucia Gallardo   

Yeah, so one of my projects right now that we’re highlighting is called the AEternals. We co-founded the project with an agency called Digi Go. One of the things we were thinking about when we first started was the current trend of NFTs. Non-fungible tokens are essentially a digital asset where you’re creating a chain of custody of how the transactions inform the journey of the asset as it changes hands and as its valuation changes, et cetera. So we were taking that, and we were seeing a lot of projects come out that were really nice pieces of art, or profile pictures that were humorous, things like that. And we looked at that and thought, actually, this could be a very cool thing if we played around with it, if we made it more complex. So how can we make it more complex? That’s how we came to this project called the AEternals. It’s essentially for rainforest protection. We built 10,000 three-dimensional NFTs that start out as baby plots: the flowers are closed, the trees are tiny seedlings, it’s very, very small. And then what we did was we started feeding it data. This data relates to your engagement toward rainforest protection as a cause. So when you donate to the charity we’re working with that does rainforest protection work, when you participate in community events that relate to learning about the rainforest, or AMAs, or things like that, when you engage with our games (we built a game for it, so you can plug your NFT into the game and explore it, play with it, and nurture it), when you just engage with the community and the cause, the more that you do, the more the plot starts to blossom. The flowers open up, the trees grow, and eventually you get to levels where there are new features and the plot itself changes, so it becomes a lot more visually complex. The fun part is also that when you go quiet, and you stop engaging, stop playing, stop donating, the plot starts to revert back to its basic state. So it is intended to be a live data reflection of how committed you really are to the cause of rainforest protection, how active you are in this journey, and what your own default measure of personal impact toward the cause is. 55% of the proceeds of this project are going to support the work of rainforest protection. We also worked with our partner company, Tresorio, in order to decrease the amount of carbon emissions this project was producing. If we had done this project in the US with very traditional data centers, we would have probably emitted around 68 to 71 tonnes of carbon. But because we worked with Tresorio in large part, and took some other measures to make the asset lighter and to add our own layer on top of Ethereum to make this project possible, the combination of that decreased the carbon footprint quite significantly, down to single digits, actually. So it’s been quite a journey for that project. But really, it is intended to be a social movement around rainforest protection, and it just manifests in the form of an NFT that is generative and constantly evolving.
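One way to picture the evolving-plot mechanic Lucia describes is as a score computed from recent engagement events that decays when activity stops, with the score mapped to a visual stage. The sketch below is purely illustrative; the event types, weights, and thresholds are hypothetical and not the AEternals implementation.

```python
# Minimal sketch of an "evolving plot": the NFT's visual stage is derived from
# recent engagement (donations, events, gameplay) and decays back toward the
# seedling state when activity stops. All names and numbers are hypothetical.
from datetime import datetime, timedelta

STAGES = ["seedling", "sprouting", "blossoming", "flourishing"]

def growth_score(events: list[dict], now: datetime) -> float:
    """Sum engagement points, discounting each event by how long ago it happened."""
    score = 0.0
    for e in events:
        age_days = (now - e["when"]).days
        decay = max(0.0, 1.0 - age_days / 90)  # events older than ~90 days count for nothing
        score += e["points"] * decay
    return score

def stage(score: float) -> str:
    thresholds = [0, 10, 25, 50]  # points needed to reach each stage
    current = STAGES[0]
    for t, s in zip(thresholds, STAGES):
        if score >= t:
            current = s
    return current

now = datetime(2022, 5, 1)
events = [
    {"when": now - timedelta(days=3),  "points": 20, "kind": "donation"},
    {"when": now - timedelta(days=40), "points": 10, "kind": "community_event"},
    {"when": now - timedelta(days=80), "points": 15, "kind": "gameplay"},
]
print(stage(growth_score(events, now)))  # reverts toward "seedling" as events age out
```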

Suresh Shankar   

That’s such a fantastic idea, because you’re combining data and the blockchain, and you’re actually doing it for social good. Why isn’t this global? Where can I get this? How can I play with it?

Lucia Gallardo   

Yeah, it’s going to be launching. We haven’t opened the sale yet; that’s why you haven’t got one yet. But you can visit aeternals.io, and if you join our Discord community, that’s where we announce when we’re going to launch it. It’s already, I’d say, maybe 98% there. Obviously, it’s a very bad market right now to launch things. But also, what we want to do is make sure most of the features are completely live by the time we get to market. There have been a lot of instances in the NFT space of promises to deliver features that never come. So we’re trying to make sure that as many features as possible are live: the game will be live from the get-go, you will already be able to evolve your NFT from the get-go, et cetera. We want to make sure people see that this is actually a very well-intended project, and that it has a long-term strategy. Other things I’ve done a lot with are in the realm of digital identity, obviously. Just last month I finished a consultation with the United Nations Development Programme on the applicability of new technology, namely blockchain, to creating legal identity systems and civil registry systems. And currently I’m working on a pilot that plays with a fair bit of different types of algorithms, as well as NFTs, to test out a new approach toward legal identity and civil registry systems. They would allow for a lot more pseudonymity or anonymity where required, and minimize the amount of data you share with different institutions, organizations and companies. It would also allow you a lot more control and visibility into the sharing of your data and who is accessing it. So we’ve played a fair bit with digital identity throughout the years; of course, a lot of the awards you mentioned in the beginning come from our work in digital identity for displaced populations.

Suresh Shankar   

And that’s an area I’ve wanted to delve into a little bit. As we go into the age of data, algorithms, identity, facial recognition, recognizing everything, I think there’s always this trade-off between convenience, personalization and privacy. And you’ve done some very interesting work there. One of the moments it all came together and clashed was when COVID happened: India did Aarogya Setu, Australia had its own COVIDSafe, Singapore has TraceTogether. But you worked on an extremely interesting app called Civitas, with the Honduran government, that balanced privacy and allowed people to get out. So could you tell us a little bit about that? Because for me, this is a great case of technology and data all being harnessed in just the right way.

Lucia Gallardo   

Yeah, that was kind of an emergency, a we-can’t-sit-here-and-do-nothing kind of thing. So we started playing around with a system. And the intention was really this understanding of: how do you build identity in a way that protects privacy and minimizes disclosure? What is it that you’re tracing, and how can you create the types of insights that people need in order to make policy decisions, but at the same time not sacrifice the patients themselves? We worked with a company called Penta, our partner in designing a lot of the solutions; we work very well together. And essentially, that’s what we created: a way for patients to call 911 and work through what some of their symptoms might be, what they were feeling, or whatever. What we had noticed was that when the pandemic first started, every hospital was attending cases. And the problem with that was that, because we weren’t so well versed in the symptoms of COVID and people were experiencing different types of symptoms, people with COVID were in the same hospital rooms as people who were already immunologically vulnerable because they had other sicknesses; obviously, when a pandemic starts, other sickness does not stop. So our immediate realization was that people were getting sick at hospitals, because essentially you were mixing everything that was going on. So the plan was really this intervention of saying: okay, first you call 911, and then we put you into buckets. Are the symptoms similar to COVID, yes or no? If no, then we send you to certain hospitals depending on your urgency, or to certain clinics if it’s not urgent. And if it was a COVID-related set of symptoms, then we would try to gauge: okay, we need to send you to telemedicine support, because the symptoms are very light but we need to monitor, or it seems like this is actually an urgent case, and there are specific hospitals across each city that will take a COVID case. What they did was they started sectioning off either areas of hospitals or hospitals entirely in order to manage specifically COVID patients. So what we were doing was creating a funnel, and then using that data to create records of what’s going on on the COVID front: what are we seeing in terms of symptoms, what are we seeing in terms of the degree of intensity people are feeling? This is the same approach we take toward any digital identity project: the idea that you can extrapolate data from the person that data relates to, to the public, and, that said, the person it relates to should have visibility into how that data is used. So when you think about digital identity and the need to balance privacy, this really comes down philosophically to: where is it possible to minimize disclosure? Where is it possible to add levels of pseudo-anonymity, and where not? Who really needs to see this data, and for how long do they actually need it? Right now, we’ve basically granted lifetime access to data when we opt into things, and that’s completely unacceptable; they don’t need the data for that long anyway, and it’s a security vulnerability to have it sitting on their servers.
The GDPR in the European Union, which is essentially data rights legislation in the EU, has tried very hard to give people some sovereignty, or some feeling of sovereignty, over their data. And I think one of the paramount clauses that people think gives them that is the right to be forgotten. So in Europe, you can compel a company to delete your data from their servers. But take the example of even just Facebook: you’ve used your Facebook login information to log on to countless other sites, to verify your identity and grant access to data across many other sites. So compelling Facebook to delete it has no bearing whatsoever on every other thing you’ve now shared on the basis that you wanted to take a shortcut and therefore used your Facebook login information. So this
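The triage funnel Lucia describes can be pictured as a simple routing function over reported symptoms and urgency. The sketch below is only an illustration; the symptom list, threshold, and facility labels are hypothetical, not the actual Civitas logic.

```python
# Minimal sketch of a triage funnel: symptoms reported over a 911 call are
# bucketed into COVID-like vs. not, then routed by urgency so COVID-like cases
# never mix with other patients. All names and thresholds are hypothetical.
COVID_LIKE = {"fever", "dry cough", "loss of smell", "shortness of breath"}

def route(symptoms: set[str], urgent: bool) -> str:
    covid_like = len(symptoms & COVID_LIKE) >= 2
    if not covid_like:
        return "general hospital" if urgent else "local clinic"
    # COVID-like cases go to dedicated facilities or remote monitoring
    return "designated COVID hospital" if urgent else "telemedicine monitoring"

print(route({"fever", "dry cough"}, urgent=False))            # telemedicine monitoring
print(route({"sprained ankle"}, urgent=False))                 # local clinic
print(route({"fever", "shortness of breath"}, urgent=True))    # designated COVID hospital
```

The aggregated routing outcomes, rather than the individual patient records, are what feed the policy-level view of how the outbreak is evolving.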

Suresh Shankar   

Yeah, and you know, I’m just going to go back to Civitas, because there’s something very interesting about it: the way you used blockchain and data, where if I needed to go out of the house, I just needed to do something with it and it would certify that I was okay to go out. But it didn’t give away my identity, because you used the blockchain to anonymize identity. And for me, this is the fascinating thing: how do I get access without surrendering my privacy, by using something that anonymizes it? In the old days, we used things like VPNs and all of that to mask it, but clearly the blockchain is a more efficient way. So what do you think is going to happen there? Will we all have this little thingy that anonymizes everything, puts it on the chain, allows me to show that I’m kosher, but doesn’t reveal my identity?

Lucia Gallardo   

With that data, we only put the transaction metadata on the blockchain, right? Because for us, another piece that’s really important is: what data belongs on a blockchain and what does not? What data should have some degree of immutability, and what should not? So this idea that you were cleared to go out, sure, that should be on a chain somewhere to verify it, so you have a record of consistency. But whether you’re okay or not, that should not be on the blockchain. I think that’s a very important distinction there. And the key thing there was
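A rough sketch of the on-chain versus off-chain split Lucia is drawing: only a commitment that a clearance event happened goes on the ledger, while the sensitive status stays in an off-chain record. The structure and names below are hypothetical, not the Civitas design.

```python
# Minimal sketch: the ledger (a list standing in for a blockchain) holds only a
# salted hash of the clearance event; the health status itself stays off-chain.
import hashlib, secrets

ledger = []            # stand-in for a blockchain: append-only transaction metadata
offchain_records = {}  # private store holding the sensitive data itself

def record_clearance(user_id: str, cleared_to_go_out: bool) -> str:
    salt = secrets.token_hex(8)
    # Off-chain: the sensitive fact, kept under the user's control
    offchain_records[user_id] = {"cleared": cleared_to_go_out, "salt": salt}
    # On-chain: only a commitment that an event happened, nothing readable
    commitment = hashlib.sha256(f"{user_id}:{cleared_to_go_out}:{salt}".encode()).hexdigest()
    ledger.append({"event": "clearance_check", "commitment": commitment})
    return commitment

record_clearance("user-42", True)
print(ledger[-1])                    # no health status visible here
print(offchain_records["user-42"])   # the actual status lives off-chain
```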

Suresh Shankar   

But isn’t that the future, where we need to go? That we actually say: listen, I have my identity, I need to access a service, and the service only needs to see a particular signal, not necessarily all the processing that led to it, just the fact that you are okay to do it. Yeah. And I think

Lucia Gallardo   

The best example to illustrate that for someone who might not be well versed in blockchain is this idea of a bar, my favorite example. So you go to a bar. What is the legal drinking age in India?

Suresh Shankar   

Well, I don’t live in India, but I would suspect it’s, you know, 21.

Lucia Gallardo   

So, so, 

Suresh Shankar   

I hope I got that right.

Lucia Gallardo   

Yeah, well, we’ll assume so. In Honduras it’s 18. So when I go to a bar, generally, they need some form of ID, right? And you’re handing over your passport, your driver’s license, your national ID card, or whatever. But this is foundational ID; this is ID that you use to ascertain your rights and responsibilities, your citizenry, and your international rights as well. And this is what you’re using to get into a bar. If you have a driver’s license that has your height and your eye color and all of this really deep information, or your address, or a document with a passport number, then you’re really exposing this to literally anyone. The bouncer does not need to know all of this information; the bouncer needs to know two things. One is your picture, to see that you are in fact who you say you are. And two, they need to know you are over 18. They don’t even need to know your age; they don’t need to know I’m 32, they need to know I’m over 18. So those are the two bits of data that they need. When you design systems with partial anonymity or pseudonymity, when you give people options to use usernames and codes, and when you decide where full disclosure is required, you can play with the design of that, because that is the world we’re going toward. However, and I love to say this very often, there’s no causation or relationship between using a blockchain and having more privacy, or more security, or this tech being used responsibly and ethically, because that has to be very intentional. Who is building this, who’s participating in the build of this, who’s auditing? Are they pulling from open source software? Are they opening up their software? What are the processes that people are using to build the technology? Because ultimately, there are a lot of identity projects out there saying, hey, we’re the next great identity project, but it’s a little bit obscured, or it’s been designed with one life experience in mind, which is typically an American white male who went to college. And when you start seeing big tech teams that look like that, it becomes very difficult to naturally embed the levels of privacy and, I guess, access controls that you would want to see in a system. If you have ever lived in a country where the government has tried to overreach, or if you’ve ever been in a situation where you were displaced, or where some component of your identity was actually the source of a discriminatory act against you, it becomes a question of who is building our identity systems. Because if you’ve ever experienced anything that made you not want to be identifiable, or not want to have identity, then these are the people who should be at the forefront of decision making when it comes to the design of these kinds of systems. And I think that’s super important.
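The bar example maps naturally onto selective disclosure: the verifier sees only a photo reference and an over-18 predicate, never the full ID document. The sketch below is a generic illustration with hypothetical names and structure, not any particular credential standard.

```python
# Minimal sketch of selective disclosure: the bouncer receives only two claims,
# a photo reference and an "over minimum age" yes/no, never the full ID record.
from datetime import date

FULL_ID = {   # foundational ID held by the user, never handed over in full
    "name": "Lucia",
    "date_of_birth": date(1990, 3, 14),
    "address": "123 Example St",
    "passport_number": "H1234567",
    "photo_ref": "photo-hash-abc123",
}

def disclose_for_bar(full_id: dict, minimum_age: int, today: date) -> dict:
    dob = full_id["date_of_birth"]
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    # Only the two facts the bouncer needs: the photo and a yes/no predicate
    return {"photo_ref": full_id["photo_ref"], "over_minimum_age": age >= minimum_age}

print(disclose_for_bar(FULL_ID, minimum_age=18, today=date(2022, 5, 9)))
# {'photo_ref': 'photo-hash-abc123', 'over_minimum_age': True}
```

The design choice is in the predicate: the verifier learns that a condition holds, not the underlying attribute it was computed from.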

Suresh Shankar   

You’ve touched upon such a lovely area, and I think we expanded on it so well. For me, the whole idea is not about the technology itself, but the intentionality behind the technology. But there’s a second nuance, which is that people start with great intentions, like Elon Musk is saying about Twitter, and we’ll find out where he ends up, because there’s a slippery slope the moment the data exists as to how it’s being used, especially where national identity systems are concerned. And this is something I’ve always wondered about. The thing you said is not only about going into a bar: you go into an office premises, they ask you for ID, and I’m like, you’re holding it for so long, and what are you doing with it, right? And when you have a national ID system, like in India, for example, it’s changed the country. They’re saying 1.3 billion Indians now have a digital ID with which they can access any service with a fingerprint. So it’s game-changing from an access point of view, it democratizes that, and it’s frightening from a what-if-it-gets-into-the-wrong-hands point of view. I could probably do a podcast just on this one topic. Because introducing intentionality into technology is probably going to be the single biggest need, as I see it, in the next decade: balancing this issue of privacy and access, trying to balance the trade-off between the two.

Lucia Gallardo   

Yeah, it’s also concerning when you start thinking about what types of services in the decentralized ecosystem governments roll out. Okay, they can do digital ID, and then they also do a wallet, because you’re going to have to deal with stablecoins or central bank based wallets. But now you start to wonder, how is this designed? Because if the government can see everything that relates to my identity and identity verification, everywhere I’m going and doing checks with my ID, and at the same time they can see my money flows, that becomes a very interesting precedent, right? Because that

Suresh Shankar   

is the trick, isn’t it? How can they authorize the service and just provide the authentication without being able to see what happened? But somebody’s going to get this one right, and that person is going to literally change the world, like Tim Berners-Lee did. And I say this because obviously there’s a lot of bad intentionality, even in blockchain, right? I mean, you have Bitcoin and all that, you have the wonderful thing of decentralization, but you don’t know what’s happening on the dark web and stuff like that. So I think somewhere along the way, it’s about finding this balance, where somebody can say, I’m happy to stop at the authentication and do nothing more, to provide that layer. And unfortunately, it may need to be a trusted source like a government. But then again, governments are also not trusted. So I think it’s going to be fascinating to see how things play out. Lucia, just one second, I’m going to do one thing: I think we will need to split this into two episodes, so I’m going to do a little bit of an outro and an intro back into the second episode, and then we’ll continue, because it’s fascinating. We’re about 30-odd minutes in, and I think if it becomes longer, people may tune out. No worries. It’s such a fascinating thing to talk about identity, Lucia. But I know that you’re an expert on the migration of people, on how cryptocurrency is being used to stop corruption, and so many more fascinating topics, so we’ll come back with another episode on this. And meanwhile, to my viewers and listeners, it’s been great to have Lucia. Don’t go away; she’s going to come back and tell us how data and AI and blockchain are being used to solve social problems like those of migrant workers, corruption, et cetera. Thanks for listening to this episode of Slaves to the Algo. We’re available on Spotify, Apple, Google, YouTube, and everywhere else you can find us, and we will be back with more from Lucia on how data and AI are being used to create social impact. Thank you for listening.