Data Day Podcasts

AI Natives, Privacy, and the Future of Data Science with Ray Mi

December 16, 2025


In this livestream, Greg Michaelson will be joined by longtime friend and fellow data scientist Ray M. for a wide-ranging conversation about how AI is reshaping both work and everyday life. They’ll talk about Ray’s path from traditional risk analytics into the startup AI world, how he thinks about data-driven decision making, and what “AI native” students, companies, and tools might look like. Expect a candid discussion about large language models, agentic workflows, education, privacy, and what all of this could mean for the next generation of data scientists. If you’re trying to understand where AI is heading and how to stay relevant, this livestream will be worth tuning into.


  • Hey, hey, hey. [Music]

    0:33

    [Music]

    1:00

    Welcome back to Data Day with me, Greg Michaelson. I'm your host, and I'm joined here by Ray Mi, my good friend

    1:06

    from DataRobot days. He and I worked together at DataRobot, and I got to know him really well. He's a fantastic

    1:12

    analyst and data scientist and consultant and researcher, and just an all-around great guy to know. So,

    1:19

    welcome, Ray. Thank you. Thank you, Greg. Awesome. Well, I wanted to maybe start

    1:25

    by letting you introduce yourself a bit. Tell us a bit about your life and what got you here.

    1:30

    What's your origin story? Not what's on LinkedIn, but if it was a Marvel movie, what would your

    1:36

    origin story look like? Sounds good. Yeah. Hi everyone, my name is Ray. I'm based out of New York.

    1:43

    So I know Greg from my DataRobot days, as early as 2018. So

    1:48

    we've known each other for 7 years. My education background is in data science. My first job was actually

    1:54

    working in risk analytics at HSBC for 5 years. Basically helping the bank

    1:59

    capture suspicious activities and bad guys doing bad things in the financial services system. And then my journey

    2:05

    took a pretty big pivot from there. Instead of working in a financial institution, very deep

    2:12

    in one specific vertical function in the bank, I moved over to the startup world. So from there I had

    2:19

    broader, pretty horizontal industry exposure. Financial services is still my key

    2:26

    focus. In my AI startup world, DataRobot is my longest home.

    2:33

    I was there for almost six years. By the time I left the company, I was the RVP of the data science practice

    2:40

    there, leading a team of up to 30 people focused on helping

    2:45

    customers deliver use cases, drive value, and be a trusted adviser to clients. And then my most recent job was

    2:53

    director of AI solutions engineering at Snorkel AI, which is a company focused on data

    2:59

    development and programmatic labeling, in order to make better data

    3:06

    for training, post-training, fine-tuning, and evaluation. Yeah, so

    3:12

    that's Snorkel. Yep. Wow, I never knew what they did. I'm glad to know that. Yep. So I was there for a year.

    3:18

    There are a lot of interesting clients I worked with there, actually, including a lot of frontier AI labs

    3:25

    that opened the door to conversations with Snorkel, so I got a lot of exposure there to

    3:30

    how companies are doing generative AI and what the specific

    3:35

    requirements are to make their models specialized. So yeah, that's my background, always in the data

    3:42

    science, machine learning, AI space. That's not your origin story, come on, go

    3:48

    back to mini Ray. How did you get into data?

    3:54

    Okay. What was it that drew you to the space back when you were a wee one?

    3:59

    Yeah, that's a great question. So, to be honest, I think

    4:04

    this just happened organically. I didn't have a strong inclination; it just naturally happened to me. When I

    4:12

    was younger, well, I was born and raised in China and came to the US about 14

    4:17

    years ago for grad school. You came as an adult? Yeah, I came to the States as an adult.

    4:22

    Wow. Okay. What was that like? A crazy transition, or what? Had you been before?

    4:28

    It was fun. No, I had only been to the States that same year. After

    4:35

    Rutgers accepted me as a grad student, I came here for a short

    4:41

    exchange program, but that happened the same year, in 2011. Before that, I was always in China.

    4:48

    And I was in Shanghai for four years for undergrad. And

    4:54

    what was the most unexpected thing that happened when you got to the US for the first time?

    5:00

    What wasn't what you expected? Well, I think it had to be the food, right?

    5:07

    Oh, of course. Yeah, American Chinese food is definitely the most attractive thing to

    5:12

    me. Kidding. Now I want to hear: what was your first American meal? Do you remember?

    5:19

    The first food you ate in America. Something like Panda Express, in that vein. I remember it was at the

    5:24

    church, because there was a church that organized... Wait, wait, wait. So you're telling me you moved to

    5:31

    America and your first meal was Chinese food? Yeah, exactly. So there's a church with a bunch

    5:37

    of Chinese friends, students, and professors, and people

    5:43

    working in the Rutgers area. They organize airport pickups every

    5:49

    semester, every August when the new students are coming, and the

    5:54

    first place they take the students is the church, where they have food and accommodation and then give you a kind

    6:01

    of brief welcoming session there. That was actually my first meal. It was kind of buffet food.

    6:07

    You grab your own stuff onto the plate. Exactly. Got it. So back to your question about

    6:13

    how I got into data. Mathematics and statistics were always something I was pretty good at in school.

    6:20

    So this naturally became my profession and my strength, and then when

    6:25

    I was looking for jobs, data scientist was always at the frontier for me to look into and also

    6:32

    where I could get opportunities. So I was hired as a data scientist at a small consulting company for a very short

    6:38

    period of time before I moved to HSBC. And I do think, for me, data

    6:45

    is the most honest thing to tell you what's actually happening in the world.

    6:51

    As long as you can collect data that represents a specific behavior or

    6:56

    profile, you can use it to detect nuances that are not so obvious to the human

    7:05

    mind, or are even counterintuitive, right? You might think the relationship is one way, but the data reveals something

    7:12

    different. But that underlying relationship is what actually matters and makes for better

    7:18

    predictions, better inferences, and better descriptions of the world. So that principle

    7:25

    keeps driving me to still enjoy doing it, regardless of whether it's called data science, statistics, machine learning,

    7:31

    or AI. The fundamental principle is the same: you have data, you

    7:38

    have a goal to achieve, and the question is how you make the data serve that purpose, and then

    7:44

    how you keep improving it based on what you see that is not performing well, improving it

    7:51

    accordingly. That principle has worked through my entire data science experience personally, but

    7:58

    it's also the interesting part that keeps me loving it. And that's what you studied at Rutgers?

    8:04

    Yeah. Yep. Was it called data science? It was statistics. I think now it's

    8:10

    called data science. Yeah. You got to rebrand. You could double your salary if you

    8:16

    go from statistician to data scientist. There are AI or machine learning majors right now, I don't know. But

    8:22

    some of the courses sound similar. You know what I found, though? I've gone and spoken at a few universities

    8:28

    recently and they're all super hyperfocused on the AI stuff, but the content's not actually getting there.

    8:34

    So, I have this feeling that the universities are a bit behind when it comes to how all of this stuff works and

    8:40

    that the professors have not sort of figured it out. They've got their own focuses, and you know,

    8:46

    they're keeping on with their research, but the students are coming in using these large language models, and it's weird, right? Have you had

    8:54

    that experience working with universities at all? Do you do that much? Not really. There's not

    8:59

    a lot of higher ed in my client base, so I don't really have a lot of touch with them. I do see those viral

    9:06

    TikTok videos of professors in the classroom yelling at students not to use ChatGPT, or everything

    9:14

    sounds similar. On that point, actually, I don't really think

    9:20

    they should. I think some of those are exaggerated, right? I don't think that's just reality;

    9:28

    some are just completely AI. You can't even tell anymore with these things, right? Yeah, exactly. Can you imagine? Three years ago, this stuff

    9:34

    didn't exist. And now suddenly you can't tell whether a TikTok video is real,

    9:40

    you know, there's a lion breaking into somebody's house in Wisconsin, and, you know,

    9:46

    there are no lions. It's crazy. I use it, actually. I use it a lot. So I hope

    9:52

    it gets as real as possible, but there are still some hints you can tell by. It's not really factually

    9:58

    right, or there are some small physical

    10:03

    phenomena that are not really well captured by the AI models. But back to education and

    10:10

    universities: I am not opposed to using AI for education. I don't know how

    10:16

    the teachers are teaching AI right now because, as you mentioned, they got trained a certain way. They have their own

    10:21

    gigs. They have their own tracks and research. So they might not be as

    10:26

    good as people in the industry working on AI every day. But I think using AI for education is

    10:33

    definitely one area that I hope can be more appreciated. There are some

    10:38

    conflicts, right? Should students use AI or not? But I went to a keynote speech by Fei-Fei Li three

    10:46

    months ago at Ai4, which is the AI conference in Las Vegas, and

    10:52

    during that keynote speech she mentioned that she hoped education can be augmented by AI, so that

    11:00

    memory-based education can be replaced by something more organic,

    11:05

    because right now we spend like 10 or 12 years remembering things. I

    11:12

    didn't learn history or politics a lot in school, but I do remember I had to recite a lot

    11:20

    of facts, right? Well, that's kind of China, right? Is that a Chinese kind of thing? Yeah. I learned history.

    11:26

    Maybe that's a stereotype. I have the feeling that there's a lot of rote memorization,

    11:31

    especially for those... In America? Yeah. Especially for those nontechnical subjects, like history or geology

    11:41

    or politics. There's a lot of memorization-

    11:48

    based learning. So for those, I hope we can just search on demand and figure out what's going

    11:55

    on, and then from there have a more inspirational string of thought on top of that, to

    12:02

    do more rationalization, get more insight, and

    12:08

    better educate individuals. So that's something I definitely hope:

    12:14

    that education, and higher ed as well, could be augmented by AI going forward. But

    12:20

    that's a little bit far from my day-to-day; I just heard an interesting talk track from Fei-Fei Li. She also

    12:26

    grew up in China, so maybe that's why I resonated a lot with her talk at that moment.

    12:33

    Yeah. Who is that? I don't know that person. Fei-Fei Li. She was the one that curated

    12:38

    the dataset for ImageNet, and she was in

    12:45

    Stanford and Google DeepMind, I might be wrong, but that's her

    12:52

    background. I'm sure you're right. Now she's leading the effort for world AI, I think. So she's

    12:59

    trying to have a virtual world built into the AI system, so it can better

    13:05

    understand physical nuances and have a digital twin of the real world. I think there's some

    13:12

    video generation technology developed by the lab as well. It's

    13:17

    pretty realistic 3D. So a lot of things can actually happen in a virtual

    13:23

    world. From my understanding, this is similar to what Nvidia is doing. I

    13:31

    attended some of the Nvidia conferences. There's a lot of hardware and GPU sales and

    13:38

    promotion on that front, but one area where they are really promoting using their GPUs is to render a 3D immersive

    13:46

    virtual world that they can use to simulate an environment. So they can train, let's say, an Amazon

    13:54

    delivery bot to trial-and-error through different scenarios in order to find the best path. Instead of using a real-

    14:01

    world setup, they can just simulate it in the virtual world to train the system to be better, and then

    14:07

    deploy the fine-tuned system and logic into a real-world

    14:14

    operating environment. So that's something I do think... So, a simulation within a simulation.
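    The simulate-then-deploy idea Ray describes here, trial and error in a virtual world where crashes cost nothing, followed by deploying the learned logic, is essentially reinforcement learning in simulation. Below is a minimal sketch of that idea using a hypothetical grid "warehouse" and tabular Q-learning; the grid, rewards, and hyperparameters are all invented for illustration and have nothing to do with Nvidia's actual tooling.

```python
import random

# Toy "simulation within a simulation": a delivery bot learns a path through
# a simulated warehouse grid by trial and error, then the learned policy is
# "deployed" as a greedy rollout. All numbers here are illustrative.

WIDTH, HEIGHT = 5, 5
START, GOAL = (0, 0), (4, 4)
OBSTACLES = {(2, 2), (3, 1)}                  # virtual racks: crashing is free
ACTIONS = [(0, 1), (0, -1), (1, 0), (-1, 0)]  # up, down, right, left

def step(pos, move):
    """Apply one simulated move; penalize crashes, reward reaching the goal."""
    nx, ny = pos[0] + move[0], pos[1] + move[1]
    if not (0 <= nx < WIDTH and 0 <= ny < HEIGHT) or (nx, ny) in OBSTACLES:
        return pos, -5.0        # virtual crash: nothing breaks in the real world
    if (nx, ny) == GOAL:
        return (nx, ny), 10.0
    return (nx, ny), -0.1       # small step cost encourages short paths

def train(episodes=3000, alpha=0.5, gamma=0.9, eps=0.2):
    """Tabular Q-learning over the simulated grid."""
    q = {}
    for _ in range(episodes):
        pos = START
        for _ in range(100):
            if random.random() < eps:
                a = random.randrange(len(ACTIONS))   # explore
            else:                                    # exploit best known action
                a = max(range(len(ACTIONS)), key=lambda i: q.get((pos, i), 0.0))
            nxt, reward = step(pos, ACTIONS[a])
            best_next = max(q.get((nxt, i), 0.0) for i in range(len(ACTIONS)))
            q[(pos, a)] = q.get((pos, a), 0.0) + alpha * (
                reward + gamma * best_next - q.get((pos, a), 0.0))
            pos = nxt
            if pos == GOAL:
                break
    return q

def rollout(q, max_steps=25):
    """Greedy 'deployment': follow the learned policy without exploration."""
    pos, path = START, [START]
    for _ in range(max_steps):
        a = max(range(len(ACTIONS)), key=lambda i: q.get((pos, i), 0.0))
        pos, _ = step(pos, ACTIONS[a])
        path.append(pos)
        if pos == GOAL:
            break
    return path
```

    All the expensive mistakes (the -5.0 crashes) happen inside `train`, the cheap simulator; `rollout` is the "deploy the fine-tuned logic into the real-world operating environment" step.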

    14:20

    Yeah, exactly. So you simulate the environment first. That environment is mimicking what the real world is, and

    14:25

    then the bot is operating in that simulated environment in order to learn, so

    14:30

    all the damage and errors don't really incur a huge cost, right? Because you can knock down a

    14:36

    rack with boxes of really fragile stuff, but it's just virtual, right? You don't really break anything in the real

    14:43

    world. And then once the bots figure out how to collaborate, how to yell to each other, how to optimize their

    14:49

    paths, avoid objects, and find the most optimal shortcuts to things

    14:54

    and deliver things on time, then those bots can be used in real life. And for those physical bots, I think the

    15:01

    sensors are pretty well defined, but the logic of how to go from A to B and avoid

    15:07

    hitting things on the way is something that is hard for them to learn, especially in a complex environment,

    15:13

    right? Like, what if the light is off? What if there's some object that is

    15:18

    not expected there? How do you adapt to the environment change? So those

    15:24

    are learning experiences that can happen in the virtual environment, not the real-world environment, and that's something really fascinating. And with

    15:31

    GPUs and all that accelerated computing, definitely that's what they are promoting from Nvidia's

    15:37

    point of view, but I think that's definitely one area that will be very

    15:43

    powerful for a lot of other scenarios, right? Not only manufacturing or logistics or supply chain,

    15:50

    but also for everyday home

    15:56

    robots. That could be a scenario as well. Have you looked at the Tesla robot? All

    16:01

    the hype about the Tesla robot. I saw video clips of it. I didn't really dig deeper into it. I

    16:09

    personally don't like the human-shaped robot. I'd like something more non-robot-looking.

    16:14

    No, it's got to be human-shaped. Man, I had this argument with my wife. She's like, "We are not getting one of those."

    16:20

    I was like, "It's not up to you. We're getting one of those. I'm going to pre-order it the day you possibly can pre-order it." They're

    16:27

    super expensive though. I think they're like $15,000 or $20,000 or something like that. But man, if you have a robot that'll do

    16:33

    your dishes and fold your laundry, the quality of my life would improve dramatically,

    16:38

    right? What would you have your robot do? What's your most hated chore?

    16:44

    I don't know, actually. Yeah, cleaning the room is definitely the one. Making the bed, cleaning the dishes.

    16:50

    Those, the bathroom. Yeah, right. Exactly. Those are fine. When I have time, I do actually enjoy doing them, but when that's

    16:57

    competing with other stuff, then it would be annoying. Yeah. Having something to help me with

    17:04

    that would definitely be helpful. So you'd do the cleaning and let the robot go to work for you? Probably, if you want to be a housewife,

    17:12

    a robot wife. That's basically what I am right now. Sounds about right.

    17:17

    Yeah, exactly. So yeah, I have a different view. I do prefer the

    17:22

    robot in a different shape, though. But man, because we talked about all the education stuff, and you

    17:29

    know, I'm watching my daughter do homework, and she's got

    17:35

    this thing called Delta Math, where it's on the computer, you know, and

    17:41

    it's really easy for the teacher because the problems are already there, and you solve the problems and then type in the answer, and

    17:48

    there's no grading, right? I've taught too, and anything where I

    17:54

    don't actually have to grade, those are tools I'm going to adopt, right? Because humans are, at their core, lazy in a lot of ways. And

    18:01

    so this is the lowest common denominator of education, right? The completely automated, form-based

    18:08

    homework. But then, what she does on a regular basis: if

    18:14

    she doesn't want to work the problem, she's in Chrome (it's on a Chromebook), and she just draws a line,

    18:20

    like a box, over it, and then Gemini will solve the problem in the browser and tell her the answer, and she types it in.

    18:26

    It's crazy. It's insane. They don't even have to go outside of the environment where

    18:33

    they're doing the work in order to use AI to solve the problem. But I really don't know what's going to happen

    18:39

    with education, because if I don't actually have

    18:45

    to know how to do anything, do I care? There's going to be a

    18:50

    huge proportion of the population that's not like you, that loves to do their chores, that's like, well, now I don't

    18:57

    have to do my chores, and it feels like a society-collapse kind of

    19:02

    moment. I don't know. That's one of the things I worry about. What do you think?

    19:07

    Yeah. Well, in my experience I use AI for everything, like

    19:13

    text, image, video, audio, like cloning my own voice for some TikTok videos. I

    19:18

    think that... Wait, wait, what? You cloned your own voice? I cloned my own voice for TikTok videos,

    19:24

    for Instagram Reels. There's an app called ElevenLabs.

    19:29

    They're really good at capturing your voice characteristics and then cloning

    19:35

    them. So you can type in a paragraph and the app will do a voiceover.

    19:41

    Yep. It sounds pretty similar. You can also tune variability. You can add some filler words. You can tune

    19:48

    how aggressive or chill you sound. All of that is tunable, and

    19:55

    you just need to record up to 30 seconds of voice and it will capture those nuances and

    20:01

    do it accordingly. Okay. Well, I know what I'm doing after this. Yeah, I use that for my

    20:06

    webinar or video editing a lot recently. If I don't really need to show my face, I just use the AI to do the voiceover

    20:13

    instead of talking, because that takes time, and I can make a video in less than an hour with the image, video, and audio all

    20:20

    combined together. But I already forgot your question. Yeah,

    20:26

    the question was about education. If we don't have to know how to do stuff,

    20:33

    what happens when nobody knows how to do stuff? Does society collapse? I really

    20:39

    think that's the ultimate outcome here: nobody has anything to do or knows how to do anything. So

    20:45

    Right. I think the initiation, the motivation, is the key. What to prompt, what you ask the AI to do, is

    20:52

    still the most human part, right? If I don't prompt all the three tools I

    20:58

    mentioned, they will not collaborate together to make a video for someone, right? Those AIs are

    21:05

    doing something for someone, but a human is doing it for themselves, right?

    21:10

    So we provide the initiative. Yep, we provide the initiative. We provide the motivation, the goal, and then the AI is

    21:17

    the one that is still the tool to get us there instead of the one driving us

    21:22

    there. It will be inspiration. The way I use it is more inspirational: have a general idea and then try to

    21:29

    figure out the different variations there in order to find the right path, instead of

    21:35

    getting all the ideas generated by AI and picking one, which is less authentic and

    21:40

    original. I don't really want to go those routes;

    21:46

    instead, I prefer to have one idea or a few ideas and red-team them with AI,

    21:53

    figure out which could work and which won't, and from there prompt the right tool for the

    21:58

    right thing, give them a goal to achieve my goal, and then stitch them

    22:05

    together. So basically, I am the agent with

    22:11

    access to different tools; those tools are also agents with access to other tools, but I will be the one that

    22:17

    drives everything and orchestrates all those steps together, right? Yeah. But so here's what I'm

    22:24

    waiting for from Apple. You have an iPhone, because our text messages are

    22:29

    blue. It's blue, yeah. What I'm waiting for is for the iPhone to listen to everything

    22:35

    I say, which of course it already does, but don't tell anybody. And to read my emails and my text

    22:42

    messages and help me do the things I need to do. Like, if my phone could send me a notification that says, "Hey,

    22:49

    don't forget to reply to so-and-so about the thing on Saturday at 2:00," or, you

    22:54

    know, "I can tell you're having a conversation right now with Ray. You meant to ask him about

    22:59

    this. Don't forget." That kind of stuff. That would be tremendously helpful. I want that. I

    23:04

    would use that completely, and it would also reduce the mental load and all that kind of stuff. But I also

    23:11

    wonder, would it make me stupid? I don't know. Would you use something like that? I would, yeah. But that's also

    23:17

    knowledge I acquired in the past, things that actually happened. Right. So it's really hard for Apple to

    23:22

    remind me of something I haven't really released to the world, something that still stays in my mind. That's what I'm saying. It's all on your

    23:28

    phone, right? You text about it, you email about it, you talk on the phone about it, you have live conversations about

    23:34

    it, you've got your phone with you all the time. There's no reason something like that couldn't exist. Yeah, definitely, I agree. So,

    23:40

    "remind me to do this," as a copilot, is definitely a good use of AI, I think. But there

    23:47

    will still be moments where we might inspire each other and realize, okay, this is a topic I want to talk about, or I

    23:53

    hadn't realized that 10 years ago this happened and I want to bring it up in this conversation. Those are deeper

    24:00

    memories that are hard to capture or connect the dots on at

    24:05

    the current level of AI, right? Because you haven't really digitized

    24:10

    those long-term memories into a message or email; they just happened in the past, before the AI era. Maybe 10 or 20

    24:17

    years from now, our current memories can be captured by an AI system, and that will

    24:23

    be a different story. But still, that long time horizon might be

    24:29

    a challenge for AI: figuring out, okay, that is something related to this conversation 10 years later, and

    24:35

    bringing it up in our current conversation. So I think, yeah, well, if AI

    24:41

    can do that, that's kind of human. That'd be spectacular. I would use

    24:48

    that every day. But at the same time, I was talking to my mom the other day, and she doesn't use

    24:56

    sleep mode on the iPhone, right? The Sleep Focus mode, so all the notifications don't bug you. And she

    25:02

    doesn't like to turn her phone on silent. And I'm like, "Mom, there are features on your phone that will make it

    25:07

    work for you the way you want." And she's like, "Eh, I don't want to set it up. It's too complicated. Whatever." And I'm like, "Well, okay. Well, you're in

    25:14

    your 60s, and yeah, I get it. Fine." And then I look at my kids and they're doing stuff on their phones, and

    25:21

    I'm like, huh, why? You know what I mean? So there's this age thing where we're like, yeah, we still want to

    25:27

    maintain the initiative, we still want to drive the AIs and do the red-teaming like you talked

    25:33

    about. Is that just because we're old? Are the kids going to be, you

    25:38

    know, are they going to say, well, the AI can do this? Yeah, they're AI native, right? They were born, raised, and grew up in

    25:43

    that environment. So they know how to deal with it. I think that's

    25:50

    true: AI didn't exist 10 years ago, right? When we first

    25:56

    started working on data science, this level of AI didn't exist, right? We also called some other stuff AI

    26:03

    back then. But yeah, they grew up in that environment. They know it much better. It's similar to

    26:09

    companies, I think. You see companies born and raised from like 2022

    26:15

    onwards; they are much more AI native, cloud native, digital native. They

    26:21

    were created with that DNA, and they don't really need to pivot; they don't need to figure out

    26:28

    how to move the business strategy from a traditional ML to an AI-native business model. They were

    26:33

    born and raised in that era, and they can ride the tide much more efficiently

    26:40

    compared to some companies with a more traditional setup, with older-

    26:45

    version infrastructure and ideologies and ways of going to market. It's harder to pivot when the ship is heavy

    26:53

    and hard to... Oh, I don't know why my TV is talking back to me. I don't

    27:00

    know why. Okay, that's a very timely topic. See? And that never happened.

    27:07

    That never happened. Maybe something I mentioned, like "AI, AI, AI," triggered it. It's an LG TV, so it's not Siri per se, but

    27:13

    Oh, it was LG. Mhm. It must be a fancy TV. It's an LG TV. The remote has a

    27:19

    microphone. So sometimes I use it for search, like searching for YouTube videos.

    27:25

    I never knew about this AI assistant functionality. Always listening. Yeah, exactly.

    27:30

    Actually, a really interesting question: I find that the younger people are, the less interested they are in

    27:38

    digital privacy. Do you find that to be the case? Because my mom won't use TikTok

    27:49

    because she's afraid that the Chinese are watching her. Mhm. And I'm like, "Mom, nobody cares what you watch on TikTok.

    27:56

    Just scroll. Just doomscroll. It's fun, right?" But there's no teenager alive that's like, "Oh, my

    28:02

    data." You know what? Actually, the YouTube video I'm watching right now, a random popup, is about why Gen Z

    28:08

    has a more favorable score toward China compared to the other generations

    28:14

    in the Western world. A favorable score? What do you mean? Yeah. Like, more pro.

    28:20

    Oh, I see. They're more in favor, less suspicious of China. Yeah, exactly. So I think that's... Well, I haven't watched

    28:27

    that, but I just opened it and wanted to see why Gen Z has a different perspective. Maybe

    28:32

    that's why, right? Some of the stereotypes are broken for them. Or

    28:32

    they just grow up in that they cannot really they they sacrifice some of the privacy privacy concerns for for

    28:38

    convenience um and they don't really see well I don't know how big a pain that

    28:43

    could be but they don't really experience the pain or challenges from like privacy point of view but the

    28:49

    convenience is much more that that adrenaline driven by those contents and the colorful stuff on the screen is way

    28:56

    like very overweighing all the other concerns or draw potential drawbacks by

    29:01

    using those applications. So yeah, and me too like I kind of do you

    29:07

    want to grant access to this? Yes, sure. I want to use the tool. Exactly.

    29:12

    And it's not really something you can customize, right? It's kind of a lump sum of everything. You have to agree or

    29:18

    disagree. You have to take the good with the bad. Yeah. Exactly. Okay. So, I have to accept it, because my goal is to use it, and

    29:24

    share some of my privacy. And I guess it's not really about hiding. Like, it's

    29:30

    hard to hide, um, in the digital world. You can, but if you want to

    29:36

    use Instagram, it will keep popping up all those tailored

    29:41

    advertisements based on your Google search results and on other browser activity or emails you have.

    29:49

    So I can block some, but it's really hard for me to sit down and figure out how they're interconnected

    29:54

    and block all the paths so Instagram will never know

    30:00

    what I'm doing in my browser, on my Google account, in my email. Uh, I don't really have that interest.

    30:06

    I don't even have the motivation to do it. I think it's fine, because that's all my behavior, right? Um,

    30:14

    I had a friend who would, uh, plant misinformation about himself. Like, you know, on his

    30:20

    Facebook he was a 72-year-old woman, and on Instagram, you know what I mean?

    30:26

    So his plan was to hide in the noise of his own misinformation. Uh, but I

    30:32

    don't know. It seems like a lot of work to me. Yeah. Exactly. Like, why, right? Maybe for people

    30:39

    with a very unique thing they want to hide, that's fine, but I

    30:46

    think I'm more open. We can let the internet know who I am, and then it can recommend me some good products. I'm fine

    30:52

    with it. Do you think it's a pendulum? Like, will something happen in 10 years

    30:57

    where there's, like, a major violation of people's privacy that'll cause all these young people to suddenly become aware,

    31:04

    and the pendulum will swing the other way and everybody will be deny, deny, deny? Or do you think it's more of a

    31:09

    natural evolution of the way things are going? I would say the latter, right? A natural evolution. But we are kind of good at

    31:17

    figuring out how to deal with it, I think. Well, not to an extreme

    31:22

    extent, but we cope with how the environment is changing and adapt accordingly, right? And kids in

    31:30

    the younger generation, with better tools and things that are much easier to access, I guess they will

    31:37

    figure out how to adapt much quicker. Um, well, because if you think about, uh,

    31:42

    the idea of freedom versus security from a societal perspective, like,

    31:48

    after the 9/11 attacks, people were way more interested in security than in freedom. And so you see things

    31:54

    like the Patriot Act and stuff like that, and now it's started to swing the other way, right? So every time something bad happens, people want security over

    32:01

    freedom, and, you know, then things are good and it swings back and people are like, freedom, freedom, you know what I

    32:06

    mean? So it's like a pendulum. It's kind of the same with safety versus privacy. I think maybe once you

    32:12

    have your data breached or something like that, then maybe you start to worry a bit more about privacy.

    32:18

    Yeah, I agree. This is something that's always there. Like, how do we

    32:23

    regulate that? How do you control it? Privacy is an important topic. It's more

    32:28

    of a B2B-level conversation

    32:34

    like how those companies are well, it's more about regulation, like how the government regulates AI systems,

    32:40

    what kinds of rules and, um, requirements are there to make

    32:46

    sure things are not breached or shared without authorization,

    32:52

    and those big giants should do their work to make sure things are not, um,

    32:58

    leaked to an extreme extent. I think data are always shared or sold, uh, for commercial purposes. Um,

    33:07

    well, like, I learned when I started working at Travelers that every time you take your car into the shop,

    33:12

    they capture your odometer reading and a few other things like that in a database somewhere that the insurance companies

    33:18

    buy, uh, without your knowledge. And that's like okay, interesting. There are

    33:23

    data sources out there that are capturing data about you that you know absolutely nothing about.

    33:29

    Yeah. I didn't even know they were capturing that data before then. Yep. It's happening. But have

    33:34

    you watched the, uh, Netflix movie The Social Dilemma?

    33:39

    That one was from a few years ago. So basically A documentary or a movie? It is a movie, uh,

    33:45

    but more like a, um, sci-fi movie. Basically, they create

    33:52

    human characters for the algorithms in the back end. Have you watched Inside Out, that

    33:59

    movie? Yes. Yeah. So basically they take all the algorithms and, uh, functions in

    34:05

    the back end of Instagram, for example, and turn them into humans: how do you tweak the dials for this person to get more

    34:11

    extreme, right? They get that feed all the time once it figures out it's something they're interested in. That

    34:17

    theory has been around for a long time all the preference-based recommendation systems work that

    34:24

    way. The movie basically just visualizes it, showing two individuals in the real world and how they

    34:30

    go from being more moderate to the extreme, uh, left and

    34:36

    extreme right based on what they're fed in social media. And

    34:41

    they kind of unbox what's in that AI recommendation algorithm by making it human. So, um,

    34:49

    that one was very thought-provoking. I think a lot of people realized, okay, a lot of

    34:54

    the things we see as true what we receive is very partial and biased,

    35:00

    right? Um, but I hope, in this new era of generative AI and

    35:07

    more advanced AI systems, that people realize, okay, there's also an

    35:13

    invisible hand in the back end that's kind of manipulating the way you interact with others and also

    35:21

    consuming your data for different purposes, right? Uh, some of those are good, some are helpful and, um,

    35:28

    very beneficial for you, but some of those are not. Do you think that there is, um so

    35:37

    certainly it's the case that TikTok and Instagram are using the algorithm to keep you looking,

    35:44

    keep you scrolling, right? Mhm. But do you think there is a, uh, a

    35:49

    more nefarious thing at work there, where the algorithms are being engineered to try to change the way

    35:55

    people think about certain things? Do you think that's happening at all? Um

    36:01

    I don't There have certainly been studies. Sorry, I'll let you answer. But there have

    36:06

    certainly been studies that show that it's easy to influence undecided people, uh, by showing them

    36:14

    content that I agree. Yeah. Well, sometimes I'm unconvinced by something, but in

    36:20

    a short period of time I'll get so reinforced by a large amount of similar information that I

    36:27

    would think, oh, this is what happened, or this is what's true. But then time matters. Well, time can work

    36:35

    in a good way or a bad way. If you keep watching the same content, you get, like, 100% convinced it's true. But

    36:42

    what I usually do is try a different platform and see what's going on for that specific topic, trying to

    36:50

    map out my own picture of it and not jump to conclusions too quickly.

    36:55

    Sometimes I argue with my friends when they see a headline and just assume it's true, and then

    37:01

    they start to be worried or outraged about it, and I'm like, is it true? So

    37:06

    always question it first. It's just one source, right? That source can be biased, or even if it's

    37:12

    neutral, you should extend the boundary and figure out the possible reality of that story,

    37:20

    because any story, as long as you're not part of it or don't have a kind of god's-eye view of all the elements, is

    37:27

    always a partial view. Like, have you watched Wicked, which is a hot movie right now? The second

    37:34

    one just came out. I tried to watch it twice; I keep falling asleep. It is yeah, it is not as good as the first

    37:39

    one. Um, no, I tried to watch the first one twice. Oh, the first one. Okay. I don't know. I saw it live in the theater,

    37:46

    but yeah, I can't make it through it. Yeah. I do keep trying, though.

    37:51

    Yeah. Wicked and The Wizard of Oz are the same story. They coexist, but they offer

    37:57

    quite different perspectives, right? Yeah. Like, if you only read the New York Times, you're going to have a liberal view of how things are, whereas

    38:04

    if you read, uh, the Wall Street Journal, say, you might have a more conservative view. Yeah. Exactly. So that's what I'm trying

    38:10

    to do always ask the question: is that true? Yeah, but you're a curious person, you

    38:17

    know, a curious, insightful, and intelligent person, whereas your average person is maybe less

    38:23

    so. Yeah, well thank you but yeah, I think that's just my

    38:30

    way of doing things, and if I can convince anyone else to do it, I will, right? That's why

    38:36

    conversation is important. A webinar or a podcast like this is important:

    38:41

    the more people can hear others' voices and learn some perspective from others, the more people can be,

    38:48

    um, curious about or challenging toward what they receive, and hopefully that will

    38:55

    make this, uh, less severe, right? When AI could create that mind trap to trap

    39:01

    you into something you believe is true. I heard the most interesting analogy the other day. Uh, I think I was listening to

    39:07

    Joe Rogan, uh, and he was talking about how, uh,

    39:12

    phones and social media are like a drug that we don't really know how to treat

    39:17

    an addiction to yet. Uh, and how in the future, you know, in 20 years or something, someone will

    39:23

    have grown up with it and they'll know: oh, I have this abuse problem with TikTok or whatever the platform will be, and

    39:30

    so I sort of have to monitor my usage of it and, you know, make sure that it doesn't consume me, and stuff like that.

    39:36

    I've already sort of seen that in my feed. Like, if I see a Judge Judy clip you know Judge Judy, those things

    39:42

    are so interesting. But if you watch two in a row, your feed will be filled with them. And so whenever I see one, I swipe

    39:49

    right away, because I don't want it, you know? Like, I don't want that to consume my life just because I was interested in

    39:54

    this one Judge Judy case and suddenly that's all I get. Yeah. Right. I actually I'm

    40:00

    I'm okay with this fact. A similar analogy:

    40:07

    I saw some memes about how, for humans well, oxygen, which is

    40:13

    essential for everyone, is basically a toxin. For, like, 80 or 90 years your body gets

    40:20

    oxidized in the environment, and then you're gone. If there were no

    40:25

    oxygen to attack your DNA, your life could be much longer. So

    40:30

    you soak in this toxic environment of oxygen, which is essential for life, for your entire

    40:37

    lifetime. It's the same, for us, with electricity, right? Or water. Those

    40:42

    could be harmful well, not water electricity, phones,

    40:49

    the internet. Those are essentials that could do harm, but you

    40:54

    don't really need to get rid of them, because that's just the environment we coexist with. And then it's up

    41:01

    to you. Like anything in life, you have to moderate and balance and Exactly

    41:06

    choose the life that you want to live, right? Yeah, that's addiction for sure, but there are also other things

    41:13

    that humans are addicted to, and we don't need to get rid of them completely, right? Just

    41:20

    moderate them. And Yeah, I'm definitely not suggesting getting rid of my phone. I'm way too addicted.

    41:26

    Right, exactly. I'm always looking to get the new one that has more AI functionality.

    41:32

    Yeah. Although, I mean, not a lot has changed on the iPhone. I've actually been really frustrated that there

    41:37

    haven't been any new features in so long. Like, they put a new button on it, and the button is so annoying

    41:43

    and barely works. I mean this one, right? The camera button. Yeah.

    41:48

    Yeah. I can't It's just easy access to the camera when your hand is,

    41:54

    say, in a glove, for example. Right, in the wintertime. But other than that, it's just well, I

    42:02

    actually hope it doesn't change significantly in appearance. I'm so used to it.

    42:08

    I don't know. Well, function-wise, I hope we can have better AI, as you mentioned. Like, I hope the

    42:13

    iPhone knows me better how I use my phone. That behavior is

    42:18

    a big amount of, uh, data for the iPhone to make some sort of system copilot or clone of myself.

    42:26

    Yeah. I feel like privacy concerns are holding people back on that

    42:33

    developers, the companies. But I'm completely ready. Take my data. Just give me an AI assistant that

    42:39

    actually knows everything about me, and I'll be happy. But the biggest problem with that I've got a couple of devices

    42:45

    that do that. There are a few startups that have created hardware that you wear, and it listens and all

    42:51

    that sort of thing, but they can't distinguish TV from not-TV.

    42:57

    So, you know, they start to mix reality and the television world in their recommendations. Like, I was

    43:04

    watching, um, The Vampire Diaries with my wife, and, uh, it started talking about witches

    43:10

    and demons and vampires and stuff in my daily summary. It became very weird, like,

    43:15

    very sort of bizarre, like my AI assistant had a mental break or something. But anyway, let's get back to

    43:21

    data science, because we only have a couple of minutes left, but I really would love to hear your views on where we're

    43:26

    going, uh, given all this AI stuff, and how, you know, it's not really

    43:32

    as important to learn to code anymore because computers can code for you, but programmatic thinking is still

    43:37

    really important, along with being able to tie everything together, use everything, and understand what it is, and

    43:43

    all that sort of thing. What does a data scientist's job look like in five years, ten years? You know, would you

    43:50

    advise your kids to go and study data science, or would you point them down a different path, given what's going on?

    43:57

    Yeah, that's a great question. To be honest, even though I'm a data scientist, my work

    44:02

    in the past two years even in my DataRobot life hasn't been that much about the

    44:07

    scientific part itself. It's more engineering and architecture. Um, so

    44:13

    from my point of view, the paradigm has been shifting: from

    44:18

    predictive models, which is more traditional ML forecasting, basically

    44:23

    predicting the next number or the next category to generative AI, which is more simple prompting, summarization,

    44:31

    and creation. And now everyone is talking about agentic AI, right? It's

    44:36

    more, uh, advanced and sophisticated, with more access to tools

    44:42

    that can be used to solve more complex problems, right? And in all of

    44:48

    those, the enhancement or evolution is in the algorithm

    44:54

    underneath, which can do a better job on very specific tasks, right? But how

    44:59

    those tasks are stitched together still requires a scientific mind to architect the process,

    45:06

    to make sure you have a system that works not only in the testing environment but also at scale in the

    45:13

    production environment, because that's how enterprises are using those generative AI or agentic systems.

    45:20

    And, um, there are still data science best practices there. If we pick

    45:25

    one example: when OpenAI was training ChatGPT, in the reinforcement

    45:31

    learning from human feedback part, they label preferences over, um, outputs for the

    45:38

    reward model, and they use very interesting statistical practices, like resampling from those rankings to

    45:46

    augment the number of data points they have: from one labeling experiment they can extract six data points, because

    45:52

    choosing two out of four outputs gives six different pairwise combinations, um,

    45:58

    when they actually use this data. So those are experimentation-design principles that

    46:04

    are still statistics related data science related but can serve, um, a different purpose: to train

    46:11

    those deep learning and large language models. Um, and one other aspect is

    46:16

    evaluation, because that's one area I think is especially important for

    46:22

    agentic systems, because there are so many parts involved in the orchestration: different tool

    46:29

    calls, different user interactions, and different agent task triaging. So

    46:35

    how accurately each step works will determine what the final output will

    46:40

    look like, right? So how do you evaluate not only the end product but also each step, and then chain them together?
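    Ray's idea of evaluating each step and then chaining the results can be sketched in a few lines of Python. This is just an illustrative sketch: the step names (triage, tool call, answer) and the pass/fail checks are hypothetical, not from any specific framework.

    ```python
    def evaluate_trace(trace, step_evaluators):
        """Score each step of one agent run, then the run as a whole.

        `trace` maps step name -> that step's output; `step_evaluators`
        maps step name -> a function returning True if the output passes.
        The end-to-end run passes only if every step passes.
        """
        step_results = {name: bool(check(trace[name]))
                        for name, check in step_evaluators.items()}
        return step_results, all(step_results.values())

    # Hypothetical three-step agent: triage -> tool call -> final answer.
    evaluators = {
        "triage":    lambda out: out in {"billing", "tech_support"},
        "tool_call": lambda out: out.get("status") == 200,
        "answer":    lambda out: len(out) > 0,
    }
    trace = {
        "triage": "billing",
        "tool_call": {"status": 500},   # the tool call failed
        "answer": "Your refund is on the way.",
    }
    steps, passed = evaluate_trace(trace, evaluators)
    print(steps["tool_call"], passed)  # False False
    ```

    Aggregating the per-step results over many traces gives step-level accuracy, which localizes where a pipeline is losing quality instead of only reporting an end-to-end score.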

    46:47
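    The pairwise-preference arithmetic Ray mentioned a moment ago can be checked directly: if labelers rank four outputs for one prompt, every unordered pair becomes one comparison for the reward model, so a single ranking yields "4 choose 2" = 6 data points. A minimal sketch (the response strings are placeholders):

    ```python
    from itertools import combinations

    def pairwise_comparisons(ranked_responses):
        """Given responses ranked best-to-worst, emit (preferred, rejected)
        training pairs -- one per unordered pair, C(K, 2) in total."""
        return [(ranked_responses[i], ranked_responses[j])
                for i, j in combinations(range(len(ranked_responses)), 2)]

    # One prompt, four ranked outputs -> six preference pairs.
    pairs = pairwise_comparisons(["best", "good", "okay", "worst"])
    print(len(pairs))  # 6
    ```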

    There will be, um, statistical or data science

    46:52

    principles needed to design that carefully and set up different evaluation criteria. And all of that is, um,

    47:01

    related to what we were doing back at DataRobot: how do you monitor and maintain your system's

    47:07

    performance? How do you, um, validate performance before putting a model in production? Right? All of that is still

    47:15

    there. People might not really need to learn all the I enjoyed learning the fundamentals, like regression or, um,

    47:22

    experimental design or inference, right? Um, but I don't

    47:29

    know how much of that is still required outside of more fundamental research. But for

    47:34

    applied work, knowing it, even through an AI system, is important. Um, but it's better

    47:41

    to know how to initiate new ideas and call the right tools with

    47:46

    the help of AI. That's how I see this evolving over time. Um, but from an

    47:52

    industry research point of view, I've learned that most companies 95% or more are

    47:59

    already adopting generative AI or agentic systems, but only around 13 or 14% are

    48:04

    actually claiming they get, um, tangible returns from those models. So it's

    48:10

    still a very early stage, right? So how do we lift that from roughly 15%

    48:16

    value creation 15% of those use cases actually creating value to a much larger number? It still requires a lot

    48:23

    of, uh, craftsmanship from engineers, AI architects, and data

    48:29

    scientists working together to figure out not only how to use it but also how to use it to

    48:35

    generate value, to do what the system is supposed to do, and to generate

    48:42

    more benefits for humans in general, right? Got it. Well, fascinating. Uh,

    48:48

    last question. I'm curious about, uh, how you are using data in your life

    48:54

    personally. I'm kind of a nerd, uh, and so I always like to dig into the different types of data that people are

    49:00

    thinking about in terms of, like, you know, day-to-day, right? So I track my sleep. I look at, you know, like, I look

    49:06

    at the rings on my Apple Watch. I'm looking at my, uh, health

    49:11

    statistics, you know, as I do whatever it is I'm doing. You know, those kinds of things are interesting to

    49:16

    me. Uh, what kinds of data do you use on a day-to-day basis in your life that, uh,

    49:23

    are important to you and that you're tracking over time? Yeah. So, when I was running more

    49:29

    extensively I've stopped right now, shamefully, but I was running pretty extensively.

    49:34

    It's too cold, man. I know. In general I didn't run much this year, but in

    49:40

    the past three years I was running a lot, including four or five marathons and a lot of half marathons. Yeah. So I use an

    49:47

    app called Strava, which is pretty popular in the runners' community, and I track data there. Uh, they

    49:53

    have a phone app that connects with your watch to collect data, but they also have a web browser,

    50:00

    a desktop version, where you can get more granular data. So I use that data to track how I'm improving and where I can,

    50:06

    because there are a lot of variables, right? Like where you're running, the temperature, the weather that

    50:11

    day, um, your, uh, fatigue score from before. Those are leading

    50:17

    indicators you can use to, um, make some predictions about how to train

    50:24

    based on the, uh, training you've already done. So I don't really have

    50:29

    a lot of models in the background; it's more my own eyeball checking, trying to understand

    50:35

    what the trend looks like. So that's one area. Um, I also use data a lot for my engagement on

    50:41

    social media, because they do provide insights how many views,

    50:46

    likes, and comments and I relate that to a few things based on my data

    50:53

    analytics, um, intuition. I don't have a model for that either,

    50:58

    right? Timing matters like when you post it on

    51:05

    social media. Um, the keywords, the hashtags you add, the music choices,

    51:11

    right? And, um, I think that's pretty much it. Then there are other things

    51:17

    you can find through trial and error as well. So those are different aspects I observe to

    51:25

    see, okay, how do I get more views, right? And, uh, not only for my

    51:30

    own posts, but also I have a kind of podcast and webinar as well, uh, for my own purposes.

    51:37

    What's it called? Promote it. Well, it's not related to data science at all, so we can

    51:44

    It's more like you don't have to be It's a personal-lifestyle type of, uh, um,

    51:50

    webinar for my community. Um, but for that one I also want views, and

    51:56

    um, not only views awareness and visibility, for people to know it exists, right? So the

    52:02

    more people I can reach, the more I can relay,

    52:07

    like, deliver my message to the people I've reached, right? Not trying to

    52:13

    saturate their minds with my ideas, but to give them a different perspective so people avoid forming an overly biased

    52:20

    opinion with the help of AI, basically. Yeah. So those are the different,

    52:26

    different areas I use data in, um, in my day-to-day. Gotcha.

    52:31

    And you mentioned Strava, and you mentioned ElevenLabs. Uh, any other apps you can't live

    52:37

    without? What's the number one app you can't live without? Well, let me see. Well,

    52:44

    is the number one app I couldn't live without. Yeah. That's kind of common for everyone, I guess. Um, I do use

    52:52

    CapCut for video editing. Uh, and there are a lot of, uh, AI tools

    52:58

    there, one of which is, uh, Veo 3, the one from Google for video generation. That is really good. Um, Google also announced

    53:06

    their own Nano Banana as part of Gemini 3, which is also pretty good. Uh,

    53:12

    Midjourney I've liked Midjourney from day one. I was using Midjourney from 2023,

    53:17

    when I started to create some, um, blog posts about generative AI. They are really

    53:22

    good. They're creative. They're not very steerable you just have

    53:29

    to accept that it is what it is. Yeah. It is what it is. But some of those images are really creative and

    53:34

    high resolution. So I prefer using them over the images generated by ChatGPT

    53:40

    or, um, by Nano Banana.

    53:45

    Uh, what else? ElevenLabs that's something I've started to explore, which is, um, a tool

    53:50

    for, uh, voice generation text-to-voice, voice cloning, voice-to-text like

    53:56

    everything audio-related. It's super helpful. I'm still exploring others. I think

    54:01

    those are the few I use a lot. I tried Claude for a period of

    54:06

    time. I like that they're good at coding. Even when you prompt it to

    54:12

    create a website or create a, um, a PowerPoint, it will just use HTML

    54:19

    and, uh, CSS to build the website for you using code, and you can

    54:25

    see the code view versus the preview, with the graphs and colors on the screen. That's super good, but it's not as

    54:33

    creative as what Google and ChatGPT can do, right?

    54:38

    Yeah. So I've signed up I have subscriptions for all

    54:45

    the tools I mentioned just now, which is a big deal. I need to consolidate that to some extent. I heard there's some I

    54:50

    forgot what the website is called where you can sign up once, pay a lump sum

    54:56

    of like 20 bucks per month, and use all the tools. They have a bigger contract on the back end, so you

    55:02

    consume credits, um, by the fraction you're using, instead of having to subscribe for

    55:07

    20 bucks for each of those applications. Because well, at least with my daily interactions

    55:14

    with those AI tools, I won't really consume that much compute power. You need a fractional subscription. Yeah.

    55:19

    Yeah. Exactly. Like a pay-as-you-go model. Nice. Awesome, Ray. Well, thank you so

    55:26

    much for taking the time. This was really fun. Uh, we, you know, we didn't focus hugely on data science, but we talked about a lot of interesting

    55:32

    stuff and AI and all the things that are going on. Before we sign off, anything that you want to promote? You want to give your Instagram or anything like

    55:38

    that, for people to follow you and see what you're up to? Yeah. Well, maybe follow my

    55:44

    LinkedIn. Uh, you can search "Ray Mi," and I think the first one that pops up will be me. Um, yeah, and

    55:52

    I'm open to job opportunities if anyone has good opportunities for, like, leadership roles in pre-sales, sales

    55:58

    engineering, or machine learning, um, expert roles, I'm happy to explore and have a

    56:04

    conversation. Nice. All right. Cool. Thanks, Ray. Appreciate your time. See you later. Thank you so much, Greg. Thanks. Bye

    56:11

    bye. [Music]
