
SHOW NOTES
In this fascinating episode, Dr. Kelly Bronson reveals how Big Tech companies use AI and big data to gain, maintain, and wield their power, especially within the field of agriculture (pun intended!). The conversation, set during a December 2024 snowstorm in Ottawa, teases out some of the ethical, moral, social, and financial implications of these technologies in order to ask: what are we even doing?!? We go deep into the framing of expertise, what it means to be a social researcher, and how to distinguish between evidence, claims, opinion, facts, and ideology, in order to move our understanding forward and bring light to the unseen. Along the way, Dr. Bronson led Phyllis to multiple outbursts of “what?!?” “no way?!?!” “Oh I get it now!!!” “Oh wow!!!!” Listen yourself and prepare to have your mind blown!
GUEST BIO
Dr. Kelly Bronson holds the Canada Research Chair (Tier 2) in Science and Society at the University of Ottawa where she is also an Associate Professor in the School of Sociological and Anthropological Studies and a core member of the Institute for Science, Society and Policy. Before joining the University of Ottawa, Dr. Bronson directed the Science and Technology Studies program at St. Thomas University in New Brunswick.
As a social scientist with a background in biology, she examines the societal and ethical implications of controversial technologies, ranging from genetically modified organisms (GMOs) to big data and artificial intelligence (AI). Her research aims to integrate community values into evidence-based decision-making regarding technological governance. Dr. Bronson’s academic contributions have been featured in both national and international journals, including the Canadian Journal of Communication and Big Data & Society. Her book, The Immaculate Conception of Data: Agribusiness, Activists, and Their Shared Politics of the Future (McGill-Queen’s University Press, 2022), critically explores the intersection of data, power, and agricultural politics. Her work has earned numerous research awards and she frequently advises on technical policy decisions, including impact assessments, for various public and private organizations.
Dr. Bronson holds a PhD in Communication and Cultural Studies from York University (Toronto, ON), a Master’s in the Sociology of Biotechnology from the University of Saskatchewan (Saskatoon), and a BScH in Environmental Biology from Queen’s University (Kingston, ON).
Errata
Phyllis suggested at least once that big tech folks are not inherently evil and she now takes that back. I mean, maybe #notalloligarchs but let’s not normalize the evil that is happening right now.
Works Cited
Arendt, Hannah. 1958. The Human Condition. Chicago: University of Chicago Press.
Arendt, Hannah. 1973. The Origins of Totalitarianism. New York: Harcourt Brace.
Benjamin, Ruha. 2019. Race After Technology: Abolitionist Tools for the New Jim Code. UK: Polity Press.
Birch, Kean, D. T. Cochrane, and Callum Ward. 2021. “Data as Asset? The Measurement, Governance, and Valuation of Digital Personal Data by Big Tech.” Big Data & Society 8(1).
Birch, Kean, and D. T. Cochrane. 2021. “Big Tech: Four Emerging Forms of Digital Rentiership.” Science as Culture 31 (1): 44–58.
Bogost, Ian.
Bronson, Kelly. 2022. The Immaculate Conception of Data: Agribusiness, Activists, and Their Shared Politics of the Future. Montreal: McGill-Queen’s University Press.
Bronson, Kelly, and Irena Knezevic. 2016. “Big Data in Food and Agriculture.” Big Data & Society 3(1).
Bourdieu, Pierre. 1977. Outline of a Theory of Practice. Cambridge: Cambridge University Press.
Carson, Rachel. 1962. Silent Spring. Greenwich, CT: Fawcett Publications.
Clapp, Jennifer. 2005. “The Political Economy of Food Aid in an Era of Agricultural Biotechnology.” Global Governance 11(4):467–85.
Coleman, James S., and Thomas J. Fararo, eds. 1992. Rational Choice Theory: Advocacy and Critique. SAGE Publications.
Crawford, Kate. 2021. Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence. New Haven, CT: Yale University Press.
Greene, Daniel, Anna Lauren Hoffmann, and Luke Stark. 2019. “Better, Nicer, Clearer, Fairer: A Critical Assessment of the Movement for Ethical Artificial Intelligence and Machine Learning.” Proceedings of the 52nd Hawaii International Conference on System Sciences, 2122–2131.
Hall, Stuart. 1982. “The Rediscovery of ‘Ideology’: Return of the Repressed in Media Studies.” In Culture, Society and the Media, edited by Tony Bennett, James Curran, Michael Gurevitch, and Janet Woollacott. New York: Routledge.
Monsanto Canada Inc. v. Schmeiser, [2004] 1 S.C.R. 902 (Supreme Court of Canada).
Morning, Ann. 2011. The Nature of Race: How Scientists Think and Teach about Human Difference. Berkeley, CA: University of California Press.
Porter, Theodore M. 1995. Trust in Numbers: The Pursuit of Objectivity in Science and Public Life. Princeton, NJ: Princeton University Press.
Zuboff, Shoshana. 2019. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. New York: PublicAffairs.
DSR mentions or related posts
Qualitative vs (?) Quantitative
Researching Writing & Portmanteau Words with Dr. Maggie Werner – Doing Social Research
What is Science? – Doing Social Research
Transcript
00:00:01 Speaker 1
Hello and welcome to the Doing Social Research podcast, where I talk with some of my favorite people who do social research to dig into the cool projects they’re working on.
00:00:10 Speaker 1
My goal is to help demystify research for students, inspire other researchers, and provide a platform for all the brilliant work of folks doing research in the humanities and social sciences.
00:00:16
And.
00:00:20 Speaker 1
I’m your host, Phyllis Rippey, a professor of sociology at the University of Ottawa and creator of the website doingsocialresearch.com.
00:00:27 Speaker 1
But today we are not here to talk about me.
00:00:29 Speaker 1
We’re here to talk with a brilliant and amazing doctor, Kelly Bronson.
00:00:33 Speaker 1
Doctor Bronson is an associate professor at the School of Sociological and Anthropological Studies at the University of Ottawa, where she also holds the prestigious position of Canada Research Chair tier 2 in science and society.
00:00:43 Speaker 1
As a social scientist, Doctor Bronson focuses on the complex interactions between science, technology, and society, particularly around controversial technologies such as GMOs, fracking, big data and AI, which I put in the notes like 100 times.
00:00:57 Speaker 1
But I’m especially excited about the AI stuff because I keep asking everyone about it.
00:01:00 Speaker 1
You’re the first person who’s actually an expert on it, but I’m so very excited about that.
00:01:05 Speaker 1
Her research aims to bridge the gap between technical knowledge and community values, fostering evidence based decision making that incorporates diverse perspectives.
00:01:12
No.
00:01:13 Speaker 1
Doctor Bronson has published extensively in regional, national and international journals, and her work has significantly contributed to understanding the sociopolitical dimensions of emergent technologies.
00:01:22 Speaker 1
Before joining the University of Ottawa, Doctor Bronson directed the Science and Technology Studies program at St. Thomas University in New Brunswick.
00:01:29 Speaker 1
She holds a PhD in communication and cultural studies from York University and a master’s degree in Science and technology from the University of Saskatchewan.
00:01:36 Speaker 1
Additionally, she has a background in biology, having worked as a genetics and plant biology lab scientist at Queen’s University.
00:01:43 Speaker 1
Doctor Bronson has been actively involved in various advisory roles on large research grants and sits on the editorial boards of several academic journals.
00:01:50 Speaker 1
She also leads the Canadian Network for Science and Democracy, collaborating with international counterparts to promote responsible innovation, and one of my favorite things about Kelly is that despite all of her brilliance and all of this fancy stuff, she’s just amazing.
00:02:04 Speaker 1
We’ve talked about the realities of combining this kind of fancy
00:02:07 Speaker 1
work with the realities of being a normal human being.
00:02:10 Speaker 1
Faced with the kinds of neuroses, fears, stresses, strains of life of the modern world that we all have to deal with, and so to borrow from RuPaul, you give not just professional professorial realness, but also like really real realness.
00:02:21
It’s.
00:02:23 Speaker 1
So anyway, I’m very excited to dig into your work and to chat about it, especially the AI stuff, as I said.
00:02:29 Speaker 1
But yeah, just to welcome to my podcast.
00:02:32 Speaker 2
Thank you.
00:02:33 Speaker 2
Thanks, Phyllis.
00:02:34 Speaker 2
I’m so happy to be here and I feel like we should.
00:02:38 Speaker 2
Take a quick sidebar and just set the stage a little bit because this is very Ottawa.
00:02:43 Speaker 2
We’re both here.
00:02:44 Speaker 1
Oh my God.
00:02:45 Speaker 2
Yeah, like physically present.
00:02:47 Speaker 2
And you know, for listeners, they can’t see behind you.
00:02:50 Speaker 2
But we’re looking at a wall of snow.
00:02:52 Speaker 1
Oh my God, it is.
00:02:52 Speaker 2
There’s no visibility.
00:02:53 Speaker 2
There’s such a huge epic snowstorm or blizzard happening outside your office window.
00:02:58 Speaker 2
Outside your place of work, it looks like we may be trapped here.
00:02:58 Speaker 1
I.
00:03:00 Speaker 1
5 almost.
00:03:01 Speaker 2
We may be stuck here talking about AI and agriculture for days.
00:03:04 Speaker 1
I know.
00:03:05 Speaker 1
I feel like I hope.
00:03:06 Speaker 1
I don’t know what what?
00:03:07 Speaker 1
Maybe it’s like our job.
00:03:08 Speaker 1
Is to like clear the fog.
00:03:10 Speaker 2
And that’s maybe, maybe not just metaphorically. As we talk, the sky will part. And also we both accidentally wore hot pink.
00:03:10 Speaker 1
Yes.
00:03:11 Speaker 1
You know.
00:03:13 Speaker 1
Right, exactly.
00:03:16
Please.
00:03:17 Speaker 1
Yes.
00:03:19 Speaker 1
Oh my God, I love it.
00:03:21 Speaker 2
Pink hot pink.
00:03:22 Speaker 2
Same colour.
00:03:23 Speaker 2
So funny, it’s not really seasonal.
00:03:24 Speaker 1
I know I love it.
00:03:26 Speaker 2
We were just 20s.
00:03:27 Speaker 1
I.
00:03:27 Speaker 2
Know and also RuPaul. Can I just say so.
00:03:30 Speaker 2
We’re not going to talk about.
00:03:31 Speaker 2
Today.
00:03:31 Speaker 2
But um, I in my off hours.
00:03:34 Speaker 2
I now teach a fitness class.
00:03:36 Speaker 2
You know, way Pilates.
00:03:36
Did.
00:03:37
Yeah.
00:03:38 Speaker 1
Oh my God.
00:03:38 Speaker 2
And litter at a gym called where I thrive.
00:03:39 Speaker 1
Where I want to take it.
00:03:40 Speaker 2
You should oh my God.
00:03:41
Yeah.
00:03:41 Speaker 1
I’ve been there cause
00:03:42 Speaker 2
OK.
00:03:43 Speaker 1
And I I got like a free one month.
00:03:43 Speaker 2
You say does for there.
00:03:45 Speaker 2
Well, you could be my guest.
00:03:46 Speaker 1
Can’t speak oh my God.
00:03:47 Speaker 2
Um, but I have a new playlist every month is yes, new playlist, new month, new playlist, December.
00:03:51 Speaker 1
OK, love it.
00:03:53 Speaker 2
Had a playlist started my playlist.
00:03:55 Speaker 2
When was this on Sunday? Monday.
00:03:56 Speaker 1
OK, OK.
00:03:58 Speaker 2
Hadn’t quite listened through all of the songs.
00:04:00 Speaker 2
Yep, put a RuPaul song on there.
00:04:02 Speaker 2
Hmm, we’re halfway through this like core exercise, and I realize this song is so spicy.
00:04:10 Speaker 2
Like it has some real inappropriate language.
00:04:15 Speaker 2
Yeah.
00:04:16 Speaker 2
So I had to hop up and you know, I don’t want to censor RuPaul, but.
00:04:20 Speaker 2
Just just.
00:04:21 Speaker 1
That that is.
00:04:21 Speaker 2
That’s too spicy for 6:45 on a Monday morning.
00:04:25 Speaker 1
That is so funny.
00:04:26 Speaker 1
I often have.
00:04:27 Speaker 1
I will often when I’m teaching, I usually play a song, especially when I’m teaching like stats or something.
00:04:33 Speaker 2
I’ve got it open.
00:04:33 Speaker 1
It’s kind of boring, so I try to.
00:04:34 Speaker 1
Like you know, with a song.
00:04:36 Speaker 1
Yes.
00:04:37 Speaker 1
Yeah.
00:04:37 Speaker 2
Does that have meaning, like it was a girl anthem post-election, or?
00:04:37 Speaker 1
Meaning, when some, well, sometimes it’ll be more like super like dork.
00:04:46 Speaker 1
Like dad joke.
00:04:47 Speaker 1
Stuff like when we’re talking about like measures of central tendency, I’ll play the song “The Middle.”
00:04:48
OK.
00:04:52 Speaker 1
That’s like, why can’t you?
00:04:54 Speaker 1
Meet me in the middle.
00:04:55 Speaker 1
I’m really ****** that up right now.
00:04:57 Speaker 1
Sorry, I’m also trying not to swear as much on this podcast and I cannot stop, because I want the university to, like, promote it.
00:04:58
That’s great.
00:05:04 Speaker 1
But eventually, once I have enough episodes, but I’m like they won’t listen to it, they’ll just be like, oh, professor.
00:05:09 Speaker 1
And then.
00:05:10 Speaker 2
Just keep it out.
00:05:10 Speaker 1
People I know, I did do that.
00:05:13 Speaker 1
On my episode with Ivy cause she is so professional she doesn’t like swear and I was like so I I beeped out some of my swears cause there weren’t.
00:05:22 Speaker 1
I was like on better behavior.
00:05:22 Speaker 2
In contrast, didn’t deserve it this.
00:05:24 Speaker 1
Yeah, right.
00:05:25 Speaker 1
Other times I just say explicit anyway anyway, but before my class, and so I tried.
00:05:27
OK.
00:05:31 Speaker 1
There’s certain songs where I’m like, Oh my God, I want to play that.
00:05:33 Speaker 1
But like there’s certain like, I’m not going to play it if it has the N word like I’m not like, judging like I I’m not gonna tell black people what to do, but as a white woman, I’m not gonna be doing that.
00:05:37 Speaker 2
And Oh yeah.
00:05:40 Speaker 2
Bitpop culture.
00:05:41 Speaker 2
Yeah, no, not appropriate.
00:05:44 Speaker 1
Or songs that could be triggering, you know, like I’m trying to be thought.
00:05:46 Speaker 2
Yeah, yeah.
00:05:47 Speaker 1
But it is like it’s hard, so I’m always asking people like students in the class or my kids.
00:05:52 Speaker 1
I’m like, will I get cancelled?
00:05:54 Speaker 1
Who is this cause?
00:05:54 Speaker 1
Also my songbook is from like 1995 and so I’m always like hey could I play?
00:06:00 Speaker 1
Kendrick Lamar was one recently and then I was like, no.
00:06:03 Speaker 1
No, I cannot.
00:06:04 Speaker 1
I was like, I think he’s awesome.
00:06:06 Speaker 1
Yeah.
00:06:06 Speaker 1
And I think what he does is awesome, but no, no, I will not be able to play any of those songs.
00:06:12 Speaker 1
So anyway, but it’s it’s hard because it’s like you want to set the vibe and it’s.
00:06:12
Yeah, it’s tricky.
00:06:18 Speaker 1
Anyway, that’s the whole thing.
00:06:18 Speaker 2
Yeah, I think the F word is fine, but, you know, derogatory terms like the C word or the B word, totally, and obviously the N word.
00:06:22
Hmm.
00:06:26 Speaker 1
Yeah.
00:06:26 Speaker 2
Anyway, sorry that was a bit of a side part.
00:06:27 Speaker 1
No, I’m just.
00:06:28 Speaker 1
No, I love it.
00:06:29 Speaker 2
Thank you for that introduction.
00:06:29 Speaker 1
I love it.
00:06:31 Speaker 2
That was very kind and I love that you said prestigious research.
00:06:35 Speaker 2
Here it is prestigious.
00:06:36 Speaker 1
It was.
00:06:37 Speaker 1
I’ll just admit that there was some AI involved in the writing of it, but I read it.
00:06:42 Speaker 1
Did I love telling this story?
00:06:44 Speaker 1
And I think again, we’ll get to my first question, but.
00:06:47 Speaker 1
When I was an undergrad, I get very angry about cheating and plagiarism when students do it, and I think it’s because like I always was like kind of a do gooder like I wasn’t an A+ student by any stretch, but I would never cheat and so it really made me mad.
00:07:02 Speaker 1
And I remember that my now ex-husband we went to undergrad together.
00:07:07 Speaker 1
We.
00:07:07 Speaker 1
And he had a friend who paid for a paper.
00:07:10 Speaker 1
So this was in the 90s before the Internet.
00:07:11 Speaker 2
Oh wow.
00:07:12 Speaker 2
Those early days that kind of falsification, yeah.
00:07:12 Speaker 1
She right.
00:07:13 Speaker 1
She right.
00:07:15 Speaker 1
And she paid something like 700.
00:07:17 Speaker 2
Whoa.
00:07:17 Speaker 1
Dollars, or 500?
00:07:18 Speaker 1
I can’t remember.
00:07:19 Speaker 2
So that’s wrong on so many levels, just the.
00:07:19 Speaker 1
I remember right.
00:07:20 Speaker 1
It felt like a lot and she got a C on it and she was really mad cause she paid so much money for it and I think I know well.
00:07:24
Oh wow.
00:07:25 Speaker 2
Ha ha.
00:07:27 Speaker 2
Ah.
00:07:30 Speaker 1
But then I paid good money for this.
00:07:31 Speaker 1
Exactly.
00:07:32 Speaker 1
And my ex-husband’s like, but you didn’t write it.
00:07:32
No.
00:07:33 Speaker 1
She’s like, well, I read it.
00:07:34 Speaker 1
And I agreed with it.
00:07:35 Speaker 1
And she literally took it all the way, and she brought it to the professor.
00:07:38 Speaker 1
They didn’t.
00:07:39 Speaker 1
Raise it enough, so she brought it to the department, right, and she, like, brought it to like the highest levels and eventually got it raised to like a B plus or something.
00:07:45 Speaker 1
And I’m like, I know, right?
00:07:46 Speaker 2
That’s hilarious.
00:07:47 Speaker 2
Of course you care about cheating, not just because I mean most of us who end up where we are.
00:07:52 Speaker 2
We’re probably sticker children who were externally motivated on some level and, you know, got a lot from getting, doing, working hard and doing well.
00:07:54 Speaker 1
Next, totally.
00:07:58 Speaker 2
In school so.
00:07:59 Speaker 1
Yes.
00:08:00 Speaker 2
So you care about justice.
00:08:01 Speaker 1
Exactly.
00:08:01 Speaker 2
Factor doing Social Research, right?
00:08:02 Speaker 1
Exactly like.
00:08:03 Speaker 2
Whenever my kids are.
00:08:04
That’s not fair.
00:08:06 Speaker 2
This is good.
00:08:06 Speaker 2
This shows that you care instead.
00:08:06 Speaker 1
Yeah, I know.
00:08:08 Speaker 1
I know, although I’ve come to as a mother, I’ve come to be like.
00:08:11 Speaker 1
Yeah, life’s not fair.
00:08:12 Speaker 1
Get used to it.
00:08:13 Speaker 2
All that for two, and the distinctions between equity or equality, you know, or equality and equity.
00:08:13 Speaker 1
I am the worst.
00:08:19 Speaker 1
Totally.
00:08:19 Speaker 2
Fairness, right?
00:08:20 Speaker 2
Doesn’t mean everyone gets the same.
00:08:22 Speaker 1
Exactly.
00:08:22 Speaker 2
That’s hard for you.
00:08:23 Speaker 2
You have multiple children, right?
00:08:24 Speaker 2
And I find this hard with my two kids, who are very different in age and ability.
00:08:24 Speaker 1
So many, yeah, yeah.
00:08:28 Speaker 2
And um, it’s tricky.
00:08:30 Speaker 1
It really is.
00:08:31 Speaker 2
Of course, you’re not gonna get the same.
00:08:32 Speaker 1
Yeah, everybody has different needs, right?
00:08:33 Speaker 2
That’s that’s more than you Oscars 13.
00:08:35 Speaker 1
Yeah, totally.
00:08:37 Speaker 2
Totally you can do more on ought to do more anyway.
00:08:41 Speaker 1
Anyway, so let’s.
00:08:42 Speaker 1
But this is all this is all relevant.
00:08:44 Speaker 1
This is all awesome.
00:08:44 Speaker 1
So I always start with my titular question, which is a word that makes me giggle.
00:08:49 Speaker 2
Yeah, anywhere Speaking of spicy.
00:08:50 Speaker 1
Anyways, my titular question.
00:08:54 Speaker 1
And you.
00:08:54 Speaker 1
I’m tryna I wanna become like super mature.
00:08:57 Speaker 1
You know I’m.
00:08:58 Speaker 1
I’m so immature and I’m, you know, I’m trying to be the, you know, the Auntie Joe Rogan.
00:09:04 Speaker 1
Like first, prov.
00:09:05 Speaker 1
Super influencer, so I gotta build my brand.
00:09:06 Speaker 2
Love it.
00:09:08 Speaker 1
This so being doing Social Research.
00:09:08 Speaker 2
OK.
00:09:10 Speaker 1
So tell me, what research are you doing these days?
00:09:14 Speaker 2
OK these days.
00:09:16 Speaker 1
Or anytime, like, really. I do it just for branding, but I wanna talk about whatever research.
00:09:21 Speaker 2
You wanna talk about?
00:09:22 Speaker 2
So I am I call myself now a reluctant AI scholar.
00:09:27 Speaker 1
OK, I love it.
00:09:28
Yes.
00:09:29 Speaker 2
Um, everyone, of course, is an AI scholar.
00:09:32 Speaker 2
Yeah, but and as you know, I have this book.
00:09:35 Speaker 2
Which we will probably talk about.
00:09:37 Speaker 1
Oh my God, I didn’t mention your book in the thing.
00:09:39 Speaker 1
Which now I’m mad at myself, but also I have it sitting right in front of me.
00:09:42 Speaker 1
I just want to pause for a second because also I talked about this with somebody else on another episode.
00:09:48 Speaker 1
Is book covers, and I know you and I had a lot of like stress and strain, but I actually think your book cover is really cool.
00:09:50 Speaker 2
I did just.
00:09:52 Speaker 2
That’s a lot of strength.
00:09:54 Speaker 1
Anyway, the book is called and it has one of the best titles.
00:09:57 Speaker 1
I just love it.
00:09:59 Speaker 1
The Immaculate Conception of Data: Agribusiness, Activists, and Their Shared Politics of the Future.
00:10:05 Speaker 1
This book came out in 2021-2022 with McGill-Queen’s University Press, and it’s an awesome book, and I want to.
00:10:10 Speaker 2
That’s forever, yeah.
00:10:14 Speaker 1
I want to get into it, so I I know we did.
00:10:15 Speaker 2
We shared a press, we shared another, um, my friend just did the collage on the cover.
00:10:17 Speaker 1
We had a shared editor and it’s very cool.
00:10:21 Speaker 1
No way.
00:10:22 Speaker 2
Yeah.
00:10:22 Speaker 1
I did not.
00:10:22 Speaker 2
She’s brilliant.
00:10:23 Speaker 1
Arts?
00:10:23 Speaker 2
Artist.
00:10:24 Speaker 2
Yep.
00:10:24 Speaker 1
I love it.
00:10:25 Speaker 2
And so that was nice.
00:10:26 Speaker 1
It’s.
00:10:27 Speaker 1
Yeah, I hate my book cover.
00:10:28 Speaker 2
But before you do, I don’t hate it.
00:10:28 Speaker 1
Yours. I just think it like captures.
00:10:31 Speaker 1
It’s just if you look at a thumbnail, you can’t see the title.
00:10:34 Speaker 1
It’s like it’s like it’s too there’s too much happening on it, but anyway.
00:10:35 Speaker 2
Yeah.
00:10:40 Speaker 1
Anyway, so let’s talk about the research you’re doing lately and about your book, but also stuff more.
00:10:43 Speaker 2
Yeah.
00:10:45 Speaker 1
All it’s it’s all related.
00:10:45 Speaker 2
OK, what am I doing? So I continue to work on this thing called AI. You know, I’m nervous about claiming expertise.
00:10:58 Speaker 2
You said I’m so excited to have an expert on AI here.
00:11:02 Speaker 2
Only because I would say that I am perhaps an expert, or at least I research with some kind of depth.
00:11:11 Speaker 2
The social dimensions of artificial intelligence, the kind of social and more so social justice impacts which I’m always keen to distinguish that from, say, the ethics of AI, which is a big part of the kind of social conversation which means the conversation that ends up being critical, critical as an analytical right, not necessarily.
00:11:14 Speaker 1
Yeah.
00:11:17
Hmm.
00:11:23
Oh.
00:11:24 Speaker 1
Interesting.
00:11:32 Speaker 2
Negative gets, I think,
00:11:35 Speaker 2
Circumscribed around well, to be honest, law like, yes.
00:11:40 Speaker 2
Yeah.
00:11:41 Speaker 2
So anyway, that’s a whole other thing we could talk about that, yeah.
00:11:42 Speaker 1
No, no, no, not anyway.
00:11:43 Speaker 1
No, no.
00:11:44 Speaker 1
But but just for a sec, cause I think that that’s also important for distinguishing between Social Research and like other kinds of research, but also think like I think that’s a really I never really thought about it that way.
00:11:55 Speaker 1
But like, just tease that out a little bit more for us about like the difference between ethics and then like the social implications of something like, you know, cause there is like this.
00:12:04 Speaker 2
Right.
00:12:05 Speaker 1
Like there’s morals, there’s ethics.
00:12:07 Speaker 1
And then there’s also, just like facts of what consequences on different social groups and like so.
00:12:13 Speaker 2
Yeah, exactly.
00:12:13 Speaker 1
Tease that out, yes.
00:12:14 Speaker 2
And it’s the latter that I’m more just like you, right?
00:12:16 Speaker 2
That’s what I care about.
00:12:19 Speaker 2
Is really power fundamentally.
00:12:22 Speaker 2
Um, yeah, there are different ways I guess, to tease that out, that distinction.
00:12:26 Speaker 2
But in a really practical way.
00:12:28 Speaker 2
So this was years ago when I was working on the book.
00:12:31 Speaker 2
And it’s interesting, I think, that the book has data in the title, because when I started the book in 2016, everyone, not just social researchers or those doing social research, but, right, members of the public, were talking about data, and everyone, it was like around just before, I suppose, but 2017, 2018, there was increasing awareness around the
00:12:35 Speaker 1
I know, yeah.
00:12:52 Speaker 2
uses and misuses of our personal data, mostly collected from online environments.
00:12:57 Speaker 2
Right.
00:12:57 Speaker 2
Cambridge Analytica scandal and the whole, you know, gaming of the American election.
00:12:59
Yes.
00:13:02 Speaker 2
Electoral politics more broadly.
00:13:03 Speaker 2
Brexit, you know, and we had, like, Facebook testifying before Congress, Colin Stretch and that really weak testimony before US
00:13:04 Speaker 1
Facebook.
00:13:13 Speaker 2
Congress, and then of course, we had Mark Zuckerberg, so there was just people were sort of aware and then I think even Elon Musk, which I think it.
00:13:22 Speaker 2
Retrospect is quite funny, but you know he he.
00:13:24 Speaker 1
What did they say?
00:13:25 Speaker 1
I don’t remember.
00:13:25 Speaker 2
Well, he, I’m pretty sure that he started this campaign online.
00:13:29 Speaker 2
Quit hashtag, quit Facebook.
00:13:31 Speaker 2
And it was like, we’re all going to.
00:13:32 Speaker 1
What?
00:13:34 Speaker 2
I just remember my students really being concerned and curious.
00:13:38 Speaker 2
And you know, I had this book, it’s MIT Press, this tiny little black book called Obfuscation.
00:13:44 Speaker 2
And people were like, actively trying to, right, find different ways of behaving online to make themselves less visible or less amenable to the kind of harvesting, which is what Shoshana Zuboff calls
00:13:58 Speaker 2
it, Shoshana Zuboff, sorry, of personal data.
00:14:02 Speaker 2
So anyway, everyone was talking about data, right?
00:14:04 Speaker 2
We were having like, a real moment of awareness and and public visibility.
00:14:09 Speaker 2
And so my whole book is about data, yeah.
00:14:12 Speaker 2
But really it could be it could be AI.
00:14:14 Speaker 2
Now everyone talks about AI.
00:14:16 Speaker 2
Yeah.
00:14:17 Speaker 2
And I actually think, and I asked an historian of AI, Luke Stark, about this two years ago.
00:14:24 Speaker 2
And he said yes, I think I agree.
00:14:25 Speaker 2
But I don’t know if anyone has actually charted this, I actually think.
00:14:29 Speaker 2
The industry has sort of led social researchers, even critical social researchers, away from the data conversation, right?
00:14:36 Speaker 1
That’s so interesting.
00:14:36 Speaker 2
So now there’s, like, critical AI studies.
00:14:38 Speaker 2
But you know, the journal where people publish is still Big Data & Society.
00:14:43 Speaker 2
Basically, everyone stopped talking about data and started talking about AI, and I actually think maybe some of that was.
00:14:50 Speaker 2
Tactical switch on the part of industry, right is like everyone was aware of data and so then people started industries.
00:14:56 Speaker 1
Like big Data is working.
00:14:57 Speaker 2
Yeah, industry started leading us toward conversations around AI and then leading the critical conversation, right?
00:15:02 Speaker 2
Those like open letters around AI and the concern, actually this comes back to, this is a super circuitous answer to your question, but it comes back to your question about what’s the distinction, social justice versus the other kinds of maybe philosophical questions.
00:15:10 Speaker 1
No, no, no.
00:15:10 Speaker 1
This is awesome.
00:15:16 Speaker 2
I actually think in this move, and again I have not looked at this systematically.
00:15:16
Yeah.
00:15:22 Speaker 2
Um, but this move toward talking about AI as opposed to data?
00:15:27 Speaker 2
I suspect it’s sort of been led by industry and you have these open letters, right, that industry folks like Sam Altman have written on AI and some of the potential negative consequences.
00:15:39 Speaker 2
And I think if one were to look at it systematically, the consequences are always those big kind of philosophical or maybe moral questions, right?
00:15:47 Speaker 2
Like, what’s the distinction between human reasoning and computer reasoning, and is this the end of human reasoning?
00:15:53 Speaker 2
And are robots going to turn on us?
00:15:55 Speaker 1
Yeah, yes, yes.
00:15:55 Speaker 2
And like the existential right?
00:15:58 Speaker 2
And then the conversation is always like, oh, but don’t worry, we got this right, like in those open letters.
00:16:02 Speaker 1
Right. Right.
00:16:03 Speaker 2
It’s always like big existential threat, and we know the power of these tools and we know them best and so leave it to us right.
00:16:09 Speaker 1
Yes.
00:16:11 Speaker 2
It kind of really conditions that governance space around AI and I, but I think all this to say.
00:16:18 Speaker 2
It’s weird that my book is about data because it really could be.
00:16:21 Speaker 2
It is a book about AI and we can come back to talking about that in a second.
00:16:22
Hold on.
00:16:25 Speaker 2
But back to the conversation about or the question about social justice versus the kind of ethics.
00:16:32 Speaker 2
I think that’s a part of it that I’m always trying to say.
00:16:35 Speaker 2
Yes, there are those really important questions that probably philosophers mostly should puzzle through.
00:16:42
Yeah.
00:16:43 Speaker 2
Maybe in concert with the technologists right about, like the distinctions, but humans and, you know and like, how do we create?
00:16:51 Speaker 2
A good as in functioning, but also good for a large number of people.
00:16:55 Speaker 2
Human machine compromise, and how do we set limits around computer reasoning such that, you know, we don’t have the I, Robot scenario, that’s a reference to that late 90s film with Will Smith, who’s, I think, cancelled, but um.
00:17:09 Speaker 1
He’s always he’s cancelled his necklace so that I never.
00:17:10 Speaker 2
But if you cancel it, I don’t know.
00:17:12 Speaker 1
Never.
00:17:13 Speaker 2
I mean, he did.
00:17:14 Speaker 2
Perform an act of physical violence in public.
00:17:16 Speaker 2
Against him, it was it, a woman.
00:17:18 Speaker 2
No, no it wasn’t.
00:17:18 Speaker 1
Who was a guy who’s in defence of his wife?
00:17:19 Speaker 2
It was in defence of a woman, true.
00:17:21 Speaker 1
It’s it’s a tricky one.
00:17:22 Speaker 2
It is a tricky one, I know.
00:17:23 Speaker 2
Anyway, sorry but, but like yeah so so.
00:17:27 Speaker 2
So.
00:17:28 Speaker 2
You know, I think it’s a very practical.
00:17:29 Speaker 2
It’s like who gets to enter into these conversations around?
00:17:33 Speaker 2
What are the risks and how do we define the risks and those two are related, right?
00:17:36 Speaker 2
How we define them then matters for who gets involved in the conversation about how to best mitigate them, and if we define the risks as only these big moral existential ones, right, and not like who’s made more powerful by these technologies, who’s disempowered, right, and do these tools fundamentally reproduce harm to particular, you know, historically made marginal social groups.
00:17:59 Speaker 1
Yes.
00:18:04 Speaker 2
These are the kind of power questions and the justice questions.
00:18:07 Speaker 2
Those are the things that I work on and that I'm fundamentally concerned about.
00:18:11 Speaker 2
But yeah, I'm a bit of a reluctant AI scholar.
00:18:15 Speaker 2
And then I'm also, partly because of my Canada Research Chair title
00:18:15 Speaker 2
And because we happen to be in Ottawa, continuously pulled into the governance space, which is a great space to be in, and there's lots happening and it's all important.
00:18:26 Speaker 2
But I’m often the only social scientist or person doing Social Research in the room with a bunch of lawyers.
00:18:33 Speaker 2
Love lawyers?
00:18:33 Speaker 2
Really important, right?
00:18:35 Speaker 2
But you know, the conversation is very much either around how do we set the bar, or how do we compare the outcomes from this particular AI
00:18:47 Speaker 2
To the law, which is basically setting the bar around legal compliance, right?
00:18:49 Speaker 1
Yeah.
00:18:52 Speaker 2
Right.
00:18:53 Speaker 2
Or, maybe some of the conversation now has stretched to, how do we compare the uses and potential future uses of this tool against things like the Equity Act?
00:19:05 Speaker 2
Which is the best case, but mostly it's human rights, and that's really important too.
00:19:10 Speaker 2
But I’m always like, let’s just let’s think a bit broader, right?
00:19:13 Speaker 2
Beyond legal compliance and beyond
00:19:13 Speaker 2
Historic legislation, because legal tools are inherently conservative, to think about what are the best outcomes for the greatest number of people, and specifically for people who are historically or currently made marginal.
00:19:27 Speaker 1
Oh my God, I love this so much.
00:19:29 Speaker 1
And this is like, I have so many thoughts. This is why I want everybody to listen to every episode of my podcast, not just so I can build my brand, but because it is so fascinating to hear you at this point, after talking to so many people with different attitudes, different
00:19:44 Speaker 1
And approaches, different feelings, and just, like, various thoughts on it. And one thing that has come up in multiple episodes is my love of Hannah Arendt and the book The Human Condition. Oh my God.
00:19:55
Oh, such a good.
00:19:56 Speaker 2
Book. But in fact the preface is almost my favorite part of that book, where she
00:20:00 Speaker 1
Sputnik. Exactly. This is like, I was just.
00:20:00 Speaker 2
Talks about Sputnik.
00:20:02 Speaker 1
I was like Willow brought it.
00:20:03 Speaker 1
Willow Scobie, in another episode, brought it up, I think.
00:20:06 Speaker 1
Yes, yes.
00:20:07 Speaker 2
That’s interesting.
00:20:07 Speaker 1
And I brought it up with.
00:20:08 Speaker 2
Of course we’re.
00:20:09 Speaker 1
Alright, but it is like, her whole point, and this is what I just keep saying, is her question.
00:20:09 Speaker 2
My colleague.
00:20:13 Speaker 1
The whole thing about Sputnik, it's like, it's a simple, modest question and then a very thick, very dense, long book.
00:20:19 Speaker 1
I just want us to think, what are we doing?
00:20:22 Speaker 1
And it’s like and you can see when she’s talking about the ways exactly.
00:20:26 Speaker 2
Why?
00:20:27 Speaker 1
It’s like, why are we doing this?
00:20:29 Speaker 1
Like, what brought us here? And, also super interesting, I just started to listen to, cause I keep getting into these arguments with my kids about, like, what was the cause of the most recent US election, so I decided to start listening to the audiobook of The Origins of Totalitarianism by Arendt, and.
00:20:49 Speaker 1
And then I was like, Oh my God, this is so good.
00:20:53 Speaker 1
But then, because I’m a nerd mom, I made my kid read to me.
00:20:56 Speaker 1
I was like they were.
00:20:57 Speaker 1
He wanted to come over for dinner and I was like, OK, but I’m like, doing stuff.
00:21:02 Speaker 1
So here, get on your computer and find me a critique of this, before I start telling you that this is the right thing.
00:21:07 Speaker 1
Find me a critique.
00:21:08 Speaker 1
He's a sociology major in third year right now.
00:21:11 Speaker 1
Yeah, it’s like, so fun.
00:21:12 Speaker 2
I love that you’ve created a critical conversation for him.
00:21:15 Speaker 2
I know parenting.
00:21:15 Speaker 1
And it’s like, that’s part of what frustrates me in our debates about the election is.
00:21:21 Speaker 1
A kind of certainty that they have with their ideas, and I have often been accused by various husbands and children that it’s like you just think what you think and you’re like, you’re not critical of yourself.
00:21:33 Speaker 1
You just like, aren’t listening.
00:21:34 Speaker 1
To other ideas. And also, like, people who will, you know, accuse me of wokeness.
00:21:40 Speaker 1
And I’m like, oh, my God, my kids call me.
00:21:42 Speaker 1
I'm like just a dumb lib, that I'm, like, not woke enough.
00:21:45 Speaker 2
Term now, yeah.
00:21:45 Speaker 1
Hmm I.
00:21:46 Speaker 1
Yes, that it’s like cause I’m not left enough for them, right.
00:21:48 Speaker 2
Liberalism offensive, OK.
00:21:50 Speaker 1
So it’s I’m too like moderate basically.
00:21:54 Speaker 1
And so I’m like, look, I’m like to me, social science, like any science, is about trying to falsify our claims.
00:22:01 Speaker 1
To me it is always about trying to disprove. Anyway, so I'm like, Jack, find me a critique, and pretty quickly I was like.
00:22:06 Speaker 1
This is dumb and I'll tell you why, but keep reading. Anyway, so I'm like, I want good evidence, but what was really helpful is the guy who wrote the article.
00:22:15 Speaker 1
I shouldn’t say he was dumb.
00:22:16 Speaker 1
It was.
00:22:17 Speaker 1
There were certain things that I disagreed with, but I wasn't looking at it closely.
00:22:19 Speaker 1
I was like listening and cooking dinner, so my apologies to this person.
00:22:22 Speaker 1
I probably should put him in the show
00:22:24 Speaker 1
Notes. He's an expert on the internet.
00:22:26 Speaker 1
But one thing that he said that I think makes sense given what I’ve read about the human condition, is that what is?
00:22:32 Speaker 1
What explains totalitarianism?
00:22:36 Speaker 1
One of the things that allows totalitarian leaders to take form to come to power is their ability to transcend context.
00:22:45 Speaker 1
And one of the things that I love about the human condition is, and her critique of science and technology is this idea that we as humans are constantly trying to transcend the human condition that we are trying to escape.
00:22:57 Speaker 1
That's why Sputnik matters, like, we want to go to outer space.
00:23:00 Speaker 1
So we can avoid being human.
00:23:01 Speaker 1
And she’s like.
00:23:01 Speaker 1
I’m sorry friends, we can’t and.
00:23:03 Speaker 2
Now you’ve got to stay with the trouble.
00:23:05 Speaker 2
As Donna Haraway would say.
00:23:06 Speaker 1
Exactly.
00:23:07 Speaker 1
And So what I just.
00:23:09 Speaker 1
And it was like, holy ****.
00:23:10 Speaker 1
And I think that what you’re describing is like with the whole big data and this shift to AI.
00:23:17 Speaker 1
It's like, what people in big tech, in these firms, are doing is they are transcending, like they have the resources and the capacity to transcend context.
00:23:27 Speaker 1
That's what they're doing, and it's like, holy ****, you are so right.
00:23:30 Speaker 1
Like, as I'm thinking about it, like, in 2016 big data was everything, like, it was everywhere.
00:23:35 Speaker 1
And so, of course, it's in the title of your book.
00:23:39 Speaker 1
But I want to know more like so to me that like is a quintessential.
00:23:45 Speaker 1
Characteristic of effective rhetoric. And also, the last recording I did was with a dear old friend of mine, Maggie Werner, who's a rhetorician at Hobart and William Smith Colleges in upstate New York.
00:23:58 Speaker 1
And she does all kinds.
00:23:59 Speaker 1
And because she teaches a lot of writing, I was like, what are your thoughts on AI for writing like?
00:24:03 Speaker 1
And she’s like, you know, I think a lot of it is a moral panic.
00:24:06 Speaker 1
I think it’s awesome.
00:24:07 Speaker 1
I use it all the time.
00:24:07 Speaker 1
It’s really helpful.
00:24:09 Speaker 1
And so it’s so fascinating to me to hear you talk.
00:24:13 Speaker 1
This is my very long winded answer to a question that nobody asked.
00:24:19 Speaker 1
That it’s like this idea that we have that there has been this shift that does feel like there’s a kind of moral panic around AI in the way that there was a moral panic around.
00:24:33 Speaker 1
Like big data. But there also is something real behind it. But it's like, we're being distracted, like, there are problems with it, but not the problems that we think are the problems.
00:24:48 Speaker 1
So can you tell me more?
00:24:49 Speaker 2
Yeah, that’s a really good way.
00:24:49 Speaker 1
Like, tell me more, what are the problems?
00:24:51 Speaker 2
The distraction is a great frame. I think of that, you know, like a
00:24:53 Speaker 2
Magician, like prestidigitation.
00:24:55 Speaker 2
Which makes me sound like a real conspiracy theorist, and I don’t think it’s like, you know, one person like Sam Altman sitting in a chair saying like, hey, folks, you know, we need to talk about AI and reframe the whole conversation.
00:24:58 Speaker 1
No.
00:25:01
Yeah, yeah.
00:25:05 Speaker 1
No, and also, if I could just pause for one second, it's like, it is in a certain sense. I can be a real.
00:25:06 Speaker 2
But I do, yeah.
00:25:11 Speaker 1
What’s the like rationalist like?
00:25:13 Speaker 1
Social thinking.
00:25:14 Speaker 1
Sociology, like 1950s.
00:25:18 Speaker 1
What’s?
00:25:18 Speaker 1
That like James Coleman.
00:25:20 Speaker 1
I’m forgetting the like theoretical school of thought that I’m claiming to be.
00:25:23 Speaker 2
Like structural
00:25:23 Speaker 1
A part of no.
00:25:26 Speaker 1
But more just like that, there’s like like there’s rational, like rationalism like that.
00:25:32 Speaker 1
There’s a kind of rational thought.
00:25:33 Speaker 1
It’s not a conspiracy.
00:25:34 Speaker 1
It’s not that like Sam Altman or these people are like evil.
00:25:36 Speaker 1
It’s that anyone, like, I’m a pragmatist.
00:25:39 Speaker 1
Like anyone who has that kind of anyone who’s gonna wanna.
00:25:43 Speaker 1
Be self protective.
00:25:44 Speaker 1
Anyone is going to want to do what like will maintain their position of whatever.
00:25:48 Speaker 1
And so it’s not that they’re like evil, it’s that they do evil as a means of just doing business.
00:25:56 Speaker 1
Like, that’s just the nature of it.
00:25:57 Speaker 2
Oh for sure.
00:25:58 Speaker 1
And so yeah.
00:25:58 Speaker 2
It’s business as usual.
00:26:00 Speaker 2
When you say like.
00:26:00 Speaker 2
Yeah, exactly.
00:26:01 Speaker 2
A kind of distraction.
00:26:02 Speaker 2
I think that’s it.
00:26:03 Speaker 2
It's, you know, for me, doing social research
00:26:07 Speaker 2
Is drawing attention to these kinds of ways that power moves right through society.
00:26:11 Speaker 2
The insidious ways the hidden ways, often through rhetoric and language, which is why I left the lab bench and did my degree in communication studies.
00:26:18 Speaker 2
Because I started to realize the place of language mostly in the US courts.
00:26:23 Speaker 2
Um in creating boundaries around who could participate in policymaking around controversial technologies.
00:26:30 Speaker 2
And we can come back.
00:26:31 Speaker 2
To that. But you know, I think in 2016 or '17, or whenever it was that we were saying there was this conversation around data.
00:26:38 Speaker 2
I think there was more of a conversation, less around the morality and the ethics, and let’s say the philosophy understood broadly and more around the political economy.
00:26:47 Speaker 2
Right, because, back to the business as usual, I think people started to see, ah, like, the uses of data, the monetization, or as I think my colleague Kean Birch at York would say, assetization, of data sets was the new business model for.
00:27:04 Speaker 2
I mean, let's call them media companies, cause even though they'll, right, evade that kind of regulatory distinction, I think they are, right?
00:27:13 Speaker 2
And the people who were setting the conversation, or companies that were setting the conversation, like Facebook. And we started to see, I think, as ordinary people.
00:27:21 Speaker 2
Wait a second, right?
00:27:22 Speaker 2
Every move online is like, you know, captured or harvested again.
00:27:27 Speaker 2
And then those data are collated and, hey, people are making money from my data sets and using them in many ways.
00:27:34 Speaker 2
Um against against the I don’t know.
00:27:38 Speaker 2
Social good, which is problematic framing.
00:27:40 Speaker 2
But yeah, and so.
00:27:43 Speaker 2
And so yeah, I think that it is a kind of distraction.
00:27:45 Speaker 2
You’re.
00:27:45 Speaker 2
But I think that it was a distraction away from.
00:27:48 Speaker 2
I think the conversation was more about the political economy before we started talking about AI.
00:27:53 Speaker 1
Do you?
00:27:53 Speaker 1
Do you think part of that was because, like, just thinking as a human being, it's like, I get excited about AI because I can see how AI can help me.
00:28:02 Speaker 1
It’s like it.
00:28:03 Speaker 1
Honestly, everyone, the bio: I was running late.
00:28:04 Speaker 2
It’s very powerful.
00:28:06
Yeah.
00:28:07 Speaker 1
I was like, write me a bio.
00:28:09 Speaker 1
About Kelly Bronson
00:28:10 Speaker 1
And there you go.
00:28:11 Speaker 1
And I was like, that looks great.
00:28:12 Speaker 1
I of course missed your book, and if I had been better, I would have.
00:28:16 Speaker 1
I would have written it probably a little differently, but it worked.
00:28:19 Speaker 1
It did the job. Whereas big data, like, you know, Cambridge Analytica, it didn't do **** for me like that.
00:28:24 Speaker 1
All that did was try to like you know, it gave me better ads or whatever.
00:28:29 Speaker 1
Do you think there’s something?
00:28:30 Speaker 1
Yeah, about this, where it's like, people could more readily see how it was problematic socially, because it didn't do anything for them
00:28:38 Speaker 2
Personally. I think you're quite right.
00:28:40 Speaker 2
Yeah, I think that, you know, we are being socialized.
00:28:46 Speaker 2
Um AI is integrated, right?
00:28:48 Speaker 2
Yeah, absolutely.
00:28:50 Speaker 2
I think that may be part of it.
00:28:52 Speaker 2
Yeah, I don’t know.
00:28:52 Speaker 2
I feel like this is a whole side project that one could do.
00:28:55 Speaker 2
Thinking about like tracing the critical conversation.
00:28:57 Speaker 1
And, but because it's so accessible too now. It's like, even, like, on LinkedIn now it's like, do you wanna write your little, you know, posting, or here, have AI do it.
00:29:07 Speaker 1
It’s like it’s not like and there was a time like my son, Jack who’s like very anti AI.
00:29:08 Speaker 2
I know.
00:29:13 Speaker 1
He's like, the robots are going to kill us.
00:29:14
Oh, interesting.
00:29:15 Speaker 1
I know he is like I hate it.
00:29:17 Speaker 1
Nobody.
00:29:17 Speaker 1
She he like we got into.
00:29:18 Speaker 1
This.
00:29:18 Speaker 1
Huge fight a few weeks ago cause I was like saying like.
00:29:21 Speaker 2
He thinks people shouldn’t use it because we’re sort of effectively making.
00:29:24 Speaker 2
It more powerful or feeding it we?
00:29:25 Speaker 1
I’m.
00:29:26 Speaker 1
Yeah.
00:29:26 Speaker 1
And I think also, you know, the environmental cost, that there are consequences to it, and that it's also like.
00:29:28 Speaker 2
Yeah, absolutely, yeah.
00:29:34 Speaker 1
You know, he’s like a real lefty and I think it’s just to him.
00:29:38 Speaker 1
It’s maybe like I don’t know, like a big data thing.
00:29:41 Speaker 1
I’m not like I’m not sure about all of the complexities of his argument, but he’s like, and he just, I think morally thinks he’s like, it’s just bad.
00:29:49 Speaker 1
Yeah, but it is becoming and I have my own thoughts, which I’ve shared on other episodes where I definitely see it as.
00:29:57 Speaker 1
I like to say it’s more nuanced, but in he would say you’re just more centrist or whatever, but it’s like I see there’s good and there’s bad, like there’s implications of everything.
00:30:06 Speaker 1
And I think it’s like there are times where technology, technology always freaks people out.
00:30:11 Speaker 1
People panic and then it just becomes kind of integrated.
00:30:13 Speaker 1
Into our lives.
00:30:14
Life sucks.
00:30:14 Speaker 2
That is true if one looks.
00:30:16 Speaker 2
I’m not an historian of technology, but of course I belong to this broad field.
00:30:20 Speaker 2
I'm a sociologist within this field called Science and Technology Studies; that's sort of my happier home within social
00:30:25 Speaker 2
Research. And there are many historians of technology that operate in that space and I love their work.
00:30:31 Speaker 2
Yeah.
00:30:32 Speaker 2
And yeah, if one looks historically, of course, there’s all of that.
00:30:35 Speaker 2
Both kind of techno-solutionist, very positive, deterministic rhetoric that, you know, came with every new invention, from Gutenberg's press to, right, the telephone.
00:30:40 Speaker 1
Yeah.
00:30:48 Speaker 2
That was the.
00:30:48 Speaker 1
I’ve house.
00:30:49 Speaker 2
Yes, and but then there’s also the critical, you know, critical social scientists included.
00:30:56 Speaker 2
Um, there's
00:30:57 Speaker 2
That critical determinism, which is, it's all bad, or it's the end of humanity. And the historians do a great job of revealing, through careful historical analysis, how this is.
00:31:01 Speaker 1
Right.
00:31:07 Speaker 2
I mean it’s I guess a different version of saying what I said before, it’s a distraction.
00:31:12 Speaker 2
It's a kind of prestidigitation, because it's often.
00:31:14 Speaker 1
Yeah.
00:31:15 Speaker 1
What’s that word you keep saying?
00:31:16 Speaker 1
Can you just spell it?
00:31:16 Speaker 2
It's like sleight of hand, you know. Presti, not a spell, but presti.
00:31:19 Speaker 1
Oht like, did you?
00:31:21 Speaker 1
OK.
00:31:22 Speaker 1
Digitation.
00:31:24 Speaker 2
Yeah, it’s like moving your hands around like a magician, right, to distract, to distract people from.
00:31:25 Speaker 1
Right.
00:31:26 Speaker 1
Yes.
00:31:27 Speaker 1
Yeah, like sleight of hand, yeah.
00:31:29 Speaker 2
The you know the things that are potentially really going on.
00:31:33 Speaker 2
The truth I mean, which I invoke with caution, but.
00:31:36 Speaker 1
Oh my God.
00:31:36 Speaker 2
And so yeah, I think it's always yes-and, right? It's like, um, it's not one or the other, it's often both, or it's.
00:31:37 Speaker 1
OK.
00:31:37 Speaker 1
Yes.
00:31:42 Speaker 1
Yes, exactly, exactly.
00:31:48 Speaker 2
Yeah, it’s complex, which makes doing Social Research tricky.
00:31:52 Speaker 2
When I talk to my parents, I think we’ll talk about that.
00:31:54 Speaker 2
You know, my very lovely parents, who came from very modest means, neither of whom has a university education.
00:31:59 Speaker 2
They’re very bright and definitely have a kind of worldly knowledge.
00:32:02 Speaker 2
But they're often asking me, like, well, what's the, you know, just give us the answer or give us the problem.
00:32:09 Speaker 2
And I'm like, well, it's systemic, you know.
00:32:12 Speaker 1
Yes.
00:32:13 Speaker 2
But capitalism and it’s, you know.
00:32:13 Speaker 1
Stop, right?
00:32:15 Speaker 1
Yeah, but it’s not just capitalism.
00:32:16 Speaker 2
And.
00:32:16 Speaker 2
So so.
00:32:17 Speaker 1
No, but it’s like human, but it.
00:32:18 Speaker 2
Of course, that just.
00:32:19 Speaker 1
But it’s all of it.
00:32:20 Speaker 1
I mean, and that again is, like, why totalitarianism.
00:32:24 Speaker 2
Yes.
00:32:24 Speaker 1
Um, to go back.
00:32:24 Speaker 2
That’s so effective.
00:32:24 Speaker 1
It’s like this transcendence of context is that it’s like.
00:32:26 Speaker 2
That’s totally.
00:32:29 Speaker 1
Life.
00:32:29 Speaker 2
In your message for sure.
00:32:30 Speaker 1
You know I.
00:32:31 Speaker 2
And I think, like the robots will kill us is a much easier message than what.
00:32:31 Speaker 1
Was thinking too.
00:32:33 Speaker 1
Right. Or the immigrants will kill us or, you know. But I think it's also why, like, Barack Obama was so much more successful than Hillary Clinton.
00:32:36 Speaker 2
Yes.
00:32:36 Speaker 2
Yeah, yeah, yeah.
00:32:42 Speaker 1
I was talking about this with my husband the other day. It's like, he was really good at keeping things simple, like he was really good at being like: hope.
00:32:52 Speaker 1
We just need hope. And it's like, Hillary Clinton, who I think, you know, there's all kinds of problems that we could talk about and ways in which she is problematic.
00:32:53 Speaker 2
Yeah.
00:33:00 Speaker 1
But she is ******* brilliant, like she is an incredibly intelligent person.
00:33:02 Speaker 2
Yeah.
00:33:05 Speaker 1
And she's a data wonk.
00:33:06 Speaker 1
So she also would be like she would talk about the complexities of things and how we need to understand this nuance.
00:33:10 Speaker 2
Yeah.
00:33:12 Speaker 1
And it’s like nobody cares about your nuance, which is so depressing.
00:33:15
I know.
00:33:15
Quickly.
00:33:16 Speaker 2
It’s still the pacing.
00:33:18 Speaker 2
Well, you know, that’s what my book is kind of about.
00:33:21 Speaker 1
Tell me more.
00:33:21 Speaker 2
OK, so I started the book in 2016. So I'm not only most often the only social scientist, or person
00:33:22 Speaker 1
Tell me more about the book.
00:33:29 Speaker 2
Doing social justice research, I would say, I would add the justice in there, within the kind of critical AI community.
00:33:37 Speaker 2
But I’m also I happen to study AI in this very particular domain.
00:33:41 Speaker 2
That's quite understudied globally, which is a use case,
00:33:45 Speaker 2
As sometimes people call it, which is agriculture.
00:33:48 Speaker 1
Right, yeah.
00:33:49 Speaker 2
And, like, most people don't
00:33:50 Speaker 2
Put those things together, and they certainly didn't in 2016 when I started.
00:33:54 Speaker 1
Yeah.
00:33:55 Speaker 2
And so in 2016, it was like, oh, wait a second.
00:33:57 Speaker 2
I had, for a decade, as you said in my bio, studied basically public resistance against GMOs and thought very carefully about, yeah, organisms.
00:34:04 Speaker 1
Genetically modified organisms, OK, yeah.
00:34:06 Speaker 2
Sorry, yes.
00:34:07 Speaker 2
Um, so I had left the lab bench, where I was effectively practicing genetic techniques, because I found that I was more interested in the kind of public face of the science.
00:34:15 Speaker 2
And like, why are farmers suing these big Agri businesses?
00:34:19 Speaker 2
Why are they?
00:34:20 Speaker 2
Concerned? And even though I hadn't really **** *** my lab supervisor, Christopher Eckert at Queen's University, um, I felt like when I tried to engage my fellow scientists in those conversations.
00:34:32 Speaker 2
Like, why do you think Members like?
00:34:34 Speaker 2
Why are farmers concerned about these tools and why are people protesting?
00:34:37 Speaker 2
And maybe why should we have labeling?
00:34:39 Speaker 2
Why do some people?
00:34:40 Speaker 2
It was like I had committed a breach of decorum, right?
00:34:42 Speaker 2
It was like.
00:34:43 Speaker 2
Oh, that people just don't understand the science, um.
00:34:47 Speaker 1
My God, I just.
00:34:47 Speaker 2
Which is interesting.
00:34:47 Speaker 1
I put a little note that I want to talk about too, and it was again like and we’ll just.
00:34:52 Speaker 1
I just wanna put a pin in this, but I wanna say it out loud and then we can maybe get to it a bit later.
00:34:57 Speaker 1
But I do.
00:34:58 Speaker 1
I was like, just in a chat with some friends about how there are certain people within the social sciences who suggest that certain among us are, like, not real social scientists, because we're activists, because we are too political or politicized, and that the real, like, real research is, like, scientific, which means it's.
00:35:11 Speaker 2
Cool, that is size.
00:35:18 Speaker 2
Like what? Quantitative?
00:35:20 Speaker 1
I guess I never understand cause I’m like I am the most like one of the most quantitative people in our department.
00:35:25 Speaker 2
Yes, you’re right, it’s true.
00:35:26 Speaker 1
It’s like I, you know, and you have this background in science and it’s like and so it’s.
00:35:28 Speaker 2
Well, you can beat you know.
00:35:32 Speaker 1
I just.
00:35:32 Speaker 1
It’s like it drives me crazy because it’s like, well, that in itself is a political idea.
00:35:38 Speaker 1
But I would love to like its.
00:35:39 Speaker 2
Absolutely.
00:35:40 Speaker 2
It’s that in itself is a political idea, which is what really had me leave the sciences as I started to think, well, like what?
00:35:46 Speaker 1
Yeah.
00:35:46 Speaker 1
Well, so yeah, tell me.
00:35:48 Speaker 2
What’s that’s in charge?
00:35:49 Speaker 2
Like you know who?
00:35:49
Right.
00:35:51 Speaker 2
Why is this a better way of understanding the world, especially when it comes to questions of, well, power?
00:35:57 Speaker 2
Yeah.
00:35:57 Speaker 2
Also like perception and feelings and values and injustice, yeah.
00:36:00 Speaker 1
Right.
00:36:00 Speaker 1
Yes.
00:36:00 Speaker 1
Yeah.
00:36:01 Speaker 1
Yeah, I’m just like, why is it like what you are trying to do?
00:36:04 Speaker 1
And this also reminds me again of Ivy Bourgeault, who, like, I think that you and she, like, you're doing totally different areas of research.
00:36:10 Speaker 1
But similar in that you both like are very fancy.
00:36:13 Speaker 1
Both published a lot.
00:36:14 Speaker 1
You both get a lot of big grants and also do a lot of work with like.
00:36:17 Speaker 1
I don’t know.
00:36:18 Speaker 1
Like stakeholders, like with the government, policymakers, like, you're very connected to the sort of wider public. And, like, she was saying, she's not very popular cause she does stuff on, like, gender, health and the professions.
00:36:30 Speaker 1
So she’s often at committees of like doctors, right?
00:36:33 Speaker 1
And she’s the one person she’s like.
00:36:34 Speaker 1
Well, we need to look at it with a gender lens.
00:36:36 Speaker 1
And they’re like.
00:36:38 Speaker 1
Like you know, no we don’t.
00:36:39 Speaker 2
Yeah.
00:36:40 Speaker 1
And it was like, so interesting because she kept pointing out.
00:36:43 Speaker 1
It's like, you know, anyone can have a gender lens, but you need training to do it.
00:36:47 Speaker 1
And it’s so interesting to me that it’s like these scientists are thinking like, well, we know about the genetics of this and so we therefore are the experts at thinking about the social implications.
00:36:53 Speaker 2
Yeah, something just an expert and then.
00:36:57 Speaker 1
It’s like, no, you need a sociologist for that.
00:36:59 Speaker 1
For the historical implications, you need a historian for that.
00:37:02 Speaker 1
And yet, you know, so it’s so, anyway, so you were.
00:37:02 Speaker 2
I think so, yeah.
00:37:03 Speaker 2
Uh huh.
00:37:05 Speaker 1
Like I love.
00:37:06 Speaker 2
You see this everywhere.
00:37:07 Speaker 2
You know, I was on a Council of Canadian Academies panel years ago, but there was a panel.
00:37:12 Speaker 2
I don’t.
00:37:13 Speaker 2
Maybe it was two years ago on the Funding Agency.
00:37:16 Speaker 2
There is this like CCA.
00:37:18 Speaker 2
You know, this, like, expert panel, and it was on NSERC.
00:37:21 Speaker 2
So, like, the funding agency for the natural sciences and engineering. And it was all about, you know, it was supposed to be a kind of critical assessment of funding, and this panel was populated entirely by scientists.
00:37:34 Speaker 2
They didn't,
00:37:34 Speaker 2
You know, have one person who had funding as an area of expertise, right, or, like, diversity in the sciences. And I just thought that was so telling.
00:37:46 Speaker 2
Absolutely.
00:37:47 Speaker 2
You know, sometimes I used to go to the.
00:37:50 Speaker 2
Canadian Science Policy Conference, I always get it confused with the government agency, CSP, anyway, CSPC, and it was a lot of that, you know, kind of thinking through the public face, the kinds of questions that I ask about the sciences and technologies. But it was mostly scientists who, yeah, think that because you're an expert in this one particular core domain, a physicist.
00:38:11 Speaker 2
Therefore, you can speak widely, right, and it's a real problem.
00:38:15 Speaker 2
So.
00:38:15 Speaker 2
So this kind of rankled me, and in particular I was like.
00:38:18 Speaker 2
And this may be comes back to our earliest conversation and about doing Social Research.
00:38:22 Speaker 2
I kind of think my fundamental job is to just always be thinking about the limitations of thinking my own, but also like in a discipline, including in the right.
00:38:32 Speaker 2
So I’ve published papers where I’m like, come on, sociologist, you think you know what you’re talking about when you talk about X, right?
00:38:38 Speaker 2
In this case, family farming or biotechnology, or I did it about data in my
00:38:42 Speaker 2
Book, right. And so hopefully I'm also taking that lens on my own work as well, but I think that that's a fundamental part of my job in doing social research.
00:38:52 Speaker 2
So I started to have that intuition on the lab bench.
00:38:55 Speaker 2
Like, I'm curious about this, and there was just a lack of curiosity, or in fact, like, a kind of superiority.
00:39:02 Speaker 2
And so I left the lab.
00:39:03 Speaker 2
And I went to Saskatchewan, where farmers were suing these big input supply companies, Monsanto being one, who were also suing farmers; these were patent disputes over these seed technologies.
00:39:09
Hmm.
00:39:17 Speaker 2
And I started to realize that actually the farmers
00:39:20 Speaker 2
Were quite scientifically educated, and, more interesting to me, their concerns weren't scientific at all.
00:39:30 Speaker 2
So, you know, they weren't anti-science; they actually weren't concerned at all about the science per se.
00:39:36 Speaker 2
They were concerned about the kind of economic impact of, as they would call it, the pollution of these GMO seeds onto their organic property,
00:39:45 Speaker 2
Disabling them from selling in European markets. They were concerned about the implications for, I would say, social cohesion in rural communities. So Monsanto was paying farmers to spy on one another and, you know, no longer were farmers meeting at Coffee Row.
00:39:57
Wow.
00:40:00 Speaker 2
It was like, you know, the biotech farmers versus the rest, and it was really fracturing these already fractured rural communities, communities that were just gutted, right, because of, like, really the corporate control of agriculture.
00:40:13 Speaker 2
Farming is an increasingly tough business, an
00:40:16 Speaker 2
Increasingly tough economic proposition, because of the power that these really powerful companies
00:40:20 Speaker 2
Wield. Um, and, but the real.
00:40:24 Speaker 2
The thing that I started to think was so interesting is it became clear to me that they were really concerned, these farmers, about the models of risk, of scientific fact, right, that were being used by our Canadian,
00:40:40 Speaker 2
I would say legal system regulatory system, Health Canada and the Canadian Food Inspection Agency scientists.
00:40:46 Speaker 2
To study and define risks, they were like these are dumb ways of assessing right.
00:40:51 Speaker 1
Yeah.
00:40:52 Speaker 2
I’d be that the farmers that did more with more eloquence, but right, so they’re not into not to geek out and go down too much into the weeds, but right the models that are used it was substantial equivalence, right.
00:41:02 Speaker 2
So this GM tomato that's fundamentally altered, right, at the level of genes or regulatory elements, is just being compared chemically to a standard, conventionally grown tomato for chemical equivalence.
00:41:17 Speaker 2
Whereas the long-term health effects, the health effects in the environment, and the potential spread of GM material via pollen drift, with, like, I would say the kind of ecological, right, the kind of Rachel Carson kind of model:
00:41:31 Speaker 2
What are the long term risks?
00:41:33 Speaker 2
Um, that was not looked at.
00:41:35 Speaker 2
They were really raising, I thought, these sophisticated questions about different models of scientific risk, which then would have an impact on who was able to participate in the regulatory system.
00:41:47 Speaker 2
Right.
00:41:47 Speaker 2
Because if it’s just something to be studied in a lab by a.
00:41:50 Speaker 2
Best.
00:41:51 Speaker 2
And then it’s like this is a sound scientific, right?
00:41:53 Speaker 2
Everything’s fine.
00:41:54 Speaker 2
There we go.
00:41:55 Speaker 2
Sound scientific model is good.
00:41:57 Speaker 2
Yeah, it was really creating boundaries around public participation, and that's when I became the kind of social researcher
00:42:04 Speaker 2
I am today, yeah.
00:42:05 Speaker 1
Yeah.
00:42:05 Speaker 1
And just to clarify, sorry to interrupt you for just a second: so you're saying that the kind of Rachel Carson, larger environmental stuff was not being looked at by the government, but that's what the farmers were raising, yeah?
00:42:12 Speaker 2
It wasn't being looked at, and it still isn't. It's even gotten narrower and narrower, because now we have gene editing, which, you know, that's also, I think, if one were to look as a rhetorician or an historian of science, like the AI conversation, I feel like the regulatory conversation has followed
00:42:28 Speaker 2
the, I don't know, the scientific conversation when it comes to genetic technologies.
00:42:33 Speaker 2
Or you could call it the bioeconomy, where, you know, so-called gene editing, or, like, genome editing,
00:42:40 Speaker 2
Being one, it’s thought to be your presumed to be, because this is the I would say, rhetoric coming from the scientific community precise, right?
00:42:48 Speaker 2
It’s a very precise intervention, medically or scientifically.
00:42:52 Speaker 1
What?
00:42:52 Speaker 1
What is?
00:42:52 Speaker 1
It.
00:42:53 Speaker 1
Is it... crispy? CRISPR?
00:42:53 Speaker 2
It’s gene editing.
00:42:54 Speaker 2
CRISPR now.
00:42:56 Speaker 2
What is that? Oh, goodness.
00:42:57 Speaker 2
Yeah.
00:42:57 Speaker 2
How do I communicate this publicly?
00:42:58 Speaker 1
You can get into the weeds.
00:42:59 Speaker 1
That’s OK.
00:42:59 Speaker 2
Can I get into the weeds a little bit?
00:43:00 Speaker 1
Yeah, totally.
00:43:01 Speaker 2
That’s well, so recombination is like crew.
00:43:04 Speaker 2
Being a new Organism, right?
00:43:06 Speaker 2
By using methods in a laboratory to move genes from one species, often, to another: taking and isolating a gene, moving it with regulatory elements that turn genetic function on and off.
00:43:17 Speaker 2
So for example, like fish genes in tomatoes, because fish, you know, they live through winter
00:43:25 Speaker 2
in the canal right now, right.
00:43:26 Speaker 2
And they just go to
00:43:27 Speaker 2
sleep.
00:43:27 Speaker 2
They go into a state of torpor. And how do fish not freeze to death, right, in the canal that freezes over and we all skate on?
00:43:29 Speaker 1
OK. Yeah.
00:43:34 Speaker 2
Well, they have a higher sugar content at the cellular level.
00:43:37 Speaker 1
Oh my God, I totally could be.
00:43:37 Speaker 2
So clever scientists then.
00:43:38 Speaker 1
A fish tissue.
00:43:40 Speaker 2
And we could all just go to sleep for the winter.
00:43:42 Speaker 2
Wouldn’t that be nice?
00:43:43 Speaker 2
A bear would be nicer.
00:43:44 Speaker 2
I would like a den versus the cold canal, but anyway.
00:43:44 Speaker 1
Rats.
00:43:48 Speaker 2
So, so, you know, clever scientists have isolated the gene and the element
00:43:52 Speaker 2
to create a higher sugar content, to prevent strawberries, for example, from freezing when shipped in a reefer, it's called a freezer truck, right, over long distances.
00:43:57 Speaker 1
So you can have a longer fruit-growing season?
00:44:01 Speaker 2
Yes.
00:44:01 Speaker 2
Oh yeah, and a longer, maybe longer growing season, but I think it's mostly for the transportation of food from
00:44:08 Speaker 1
Right.
00:44:10 Speaker 2
places of growing to all over the world.
00:44:13 Speaker 2
And yeah, so.
00:44:15 Speaker 2
Where are we going with this?
00:44:16 Speaker 2
Oh, CRISPR! So the CRISPR technology is just even more precise.
00:44:21 Speaker 2
It's like an edit.
00:44:23 Speaker 2
I mean, that's a metaphor in and of itself. But because it's a kind of more precise intervention,
00:44:29 Speaker 2
I feel like the public conversation has not followed; the regulation has just assumed,
00:44:35 Speaker 2
Oh, this is more precise.
00:44:36 Speaker 2
Therefore, it has fewer kinds of unintended or knock-on effects.
00:44:39 Speaker 1
Oh, it’s like they’ve just streamlined it.
00:44:40 Speaker 2
Um, yes, we’ve just streamlined that.
00:44:41 Speaker 1
This is.
00:44:42 Speaker 1
Inherently better.
00:44:43 Speaker 2
Yeah, I don’t know.
00:44:44 Speaker 2
Again, I haven’t studied this systematically.
00:44:46 Speaker 2
But I feel like that’s kind of happened and in fact that I referenced in my book coming back to the book, this great article by a woman named Ann.
00:44:57 Speaker 2
Morning.
00:44:57 Speaker 2
I think it’s from 2003.
00:44:58 Speaker 2
It’s really old where she talks about scientific racism and my book does not deal with race or racial injustice and technology because I’m not the person to do that.
00:45:09 Speaker 2
But what I really like is the heuristic that she uses in this article where she says, you know.
00:45:14 Speaker 2
Many of us know the story of, say, the perpetuation of scientific racism through "old school" sciences, and I put that in quotation marks, like phrenology, right? Or like, what?
00:45:22 Speaker 1
Age.
00:45:25 Speaker 1
Yeah.
00:45:27 Speaker 2
I mean, even eugenics.
00:45:28 Speaker 1
The census, like, the entire creation of science was to justify, like, racial categories.
00:45:30 Speaker 2
Yeah.
00:45:30 Speaker 2
Science, yeah.
00:45:31 Speaker 2
Yeah, yeah.
00:45:31 Speaker 2
Yeah, exactly.
00:45:33 Speaker 2
Yeah, right.
00:45:34 Speaker 2
Or like it’s a bit one, it’s a bit easier to see or has been easier to.
00:45:39 Speaker 2
point
00:45:39 Speaker 1
Yes.
00:45:40 Speaker 2
out. Um, but she then talks about the move toward, like,
00:45:44 Speaker 2
biotechnologies as an increasing sophistication of the technology, where the critical conversation hasn't really followed, or necessarily been able to follow, because the science is so complex and because the tools, right, are complex, proprietary, like, and so it kind of goes under; it becomes insidious, the scientific racism. And I really like that
00:46:05 Speaker 1
Oh, totally.
00:46:07 Speaker 2
heuristic for thinking through
00:46:10 Speaker 2
what kind of happens when we start to talk about things like big data or AI. And again, it comes back to who's in charge of the critical conversation, or who's steering that bus, right?
00:46:22 Speaker 2
If the science or the technique is so complex, then it's easier, I think, for those who sit in positions of power, not just the technologists but the technologists who happen to work for really powerful and very wealthy corporations,
00:46:39 Speaker 2
to say, ooh, we've got this. And I love, coming back to that point I made earlier, the open letters.
00:46:47 Speaker 2
Right.
00:46:47 Speaker 2
Like the Sam Altman, the OpenAI, open letter, or, there were, like, white papers. And I even just love the language of the open letter.
00:46:56 Speaker 2
The White Paper, like a government white paper, was very much like we’re just going to.
00:47:00 Speaker 2
Tell you about something really important, but like a bunch of it is going to be redacted.
00:47:04 Speaker 2
We’re just going to tell you what you.
00:47:05 Speaker 2
Need to know and nothing more, right? Right.
00:47:07 Speaker 2
So I just think it’s so poignant that they call these open letters or it really shows that a big part of the conversation is closed, necessarily close to who has the key question spread filling open.
00:47:14 Speaker 1
Oh, interesting, right?
00:47:16 Speaker 1
Right.
00:47:16 Speaker 1
Like if you need an open letter like that implies, yeah.
00:47:18 Speaker 2
What’s closed?
00:47:19 Speaker 1
What?
00:47:20 Speaker 1
What?
00:47:20
Right.
00:47:20 Speaker 1
What’s happening in the clothes letters?
00:47:21 Speaker 2
There’s a real like, we got this behind closed doors.
00:47:24 Speaker 2
I think approach.
00:47:26 Speaker 2
Hey.
00:47:26 Speaker 1
And you can see how effective that is.
00:47:27 Speaker 1
Well, I was.
00:47:28 Speaker 1
I was just gonna say too, like one person.
00:47:30 Speaker 1
I just assigned in my social class, and while it's a little bit older now: Ruha Benjamin.
00:47:34 Speaker 2
So so great, yeah.
00:47:35 Speaker 1
I think I’ve mentioned her like she does a lot of stuff and I just remember about.
00:47:39 Speaker 1
racism within coding, within technology, all that kind of stuff.
00:47:43 Speaker 1
And I just remember I assigned it again in class, which is the only time I read,
00:47:48 Speaker 1
I’m sad to admit because it’s like right.
00:47:52 Speaker 2
Really, teaching it
00:47:52 Speaker 1
It’s like I assigned books exactly right.
00:47:53 Speaker 2
keeps us current, exactly. Thank goodness for students.
00:47:56 Speaker 1
Exactly.
00:47:57 Speaker 2
Undergrads, yeah.
00:47:57 Speaker 1
Oh my God, totally. It gets me to actually, like, stay more current. Anyway.
00:48:00 Speaker 1
But she talks a lot about the ways in which, and again, it's not that there's, like, an evil person, it's not that it's,
00:48:08 Speaker 1
Like you know, some old racist redneck.
00:48:10 Speaker 1
It’s just some dudes in Silicon Valley who aren’t trained to think about these things, you know, and they’re just applying their own.
00:48:11 Speaker 2
No.
00:48:18 Speaker 1
Yeah.
00:48:19 Speaker 1
perspective, their own life experiences, and the consequences are enormous for others. And I think people get so caught up in, like, oh, they didn't mean to, or, you know, I'll hear people be like, of course they did, they're really smart.
00:48:30 Speaker 1
And it’s like, well, **** yeah, they’re smart.
00:48:32 Speaker 1
But it doesn’t mean that they know everything about everything, you know.
00:48:35 Speaker 2
And I think it comes back down to, I guess, these fundamental kinds of frameworks or presuppositions about what is expertise and, right, where are the boundaries between science and society, which is the title of my
00:48:43 Speaker 1
Yes.
00:48:47 Speaker 2
chair.
00:48:47 Speaker 2
Yeah, right.
00:48:49 Speaker 2
Because yes, they didn’t mean to, but but they operate as.
00:48:56 Speaker 2
If.
00:48:57 Speaker 2
They’d know everything, right?
00:48:58 Speaker 2
They’re not, they’re not trained to think about ideology like social researcher would be right or about limitations of knowledge.
00:49:05 Speaker 2
And in fact, given the received messaging, yeah, especially for like white men, right?
00:49:11
I’m.
00:49:12 Speaker 2
Not to get into the gender aspect of it, but around technology, right, they're not really meant to think about, or
00:49:15 Speaker 1
Oh God.
00:49:19 Speaker 2
There’s no checks and balances on.
00:49:20 Speaker 1
How?
00:49:20 Speaker 2
And on their.
00:49:23 Speaker 1
Power. And it’s also like I was saying before we started recording, I was talking about my idea of a practice of humility, which I talk at my book.
00:49:30 Speaker 1
But to me, like, what drives so much of this is hubris. It is just the hubris of: I need to know, I need to be extraordinary.
00:49:30 Speaker 2
Right.
00:49:34 Speaker 2
Yes.
00:49:36 Speaker 2
Daily.
00:49:38 Speaker 1
And I think that’s partly like what I wanted to make this podcast is like, I want the realness there.
00:49:43 Speaker 1
Like I want people to see, like, how
00:49:45 Speaker 1
smart my friends are.
00:49:46 Speaker 1
How?
00:49:48 Speaker 1
It’s like how, like how much thought and how much work goes into it, but also like I intentionally ask questions of like I don’t get it like it it’s sometimes like my impulse can be I wanna be smart.
00:49:49
No.
00:49:59 Speaker 1
Like I want people to think I’m smart.
00:50:00 Speaker 2
Yeah.
00:50:01 Speaker 1
Even you?
00:50:01 Speaker 2
To be read as smart, as authoritative. But there's a politics to that, absolutely.
00:50:03 Speaker 1
Right.
00:50:04 Speaker 1
Yeah.
00:50:05 Speaker 1
And it’s just like and so to me, it’s like I’m trying to like model humility of being like, explain that to me.
00:50:09 Speaker 2
And.
00:50:09 Speaker 2
That.
00:50:10 Speaker 1
I don’t understand this thing.
00:50:12 Speaker 1
I don’t know because it is so hard for so many of us, and I also am working on this idea of like stupid ISM.
00:50:19 Speaker 1
It’s like I think a thing where it.
00:50:19 Speaker 2
I.
00:50:19 Speaker 2
That.
00:50:20 Speaker 1
Like, we all are stupid about certain things, but it's, like, so undoing emotionally for people to think that we're stupid about something, that we all pretend to be smarter about all kinds of things, and
00:50:29 Speaker 2
Absolutely.
00:50:31 Speaker 2
That we perpetuate this system.
00:50:33 Speaker 2
This is getting into the unpolished academic, I think.
00:50:37 Speaker 2
I think you’re exactly right.
00:50:38 Speaker 2
You know, no one really talks about how the sausage is made:
00:50:42 Speaker 2
the book or the article or the lecture. In the Academy we all perform and then, you know, give our students these polished objects.
00:50:47 Speaker 1
Yeah.
00:50:53 Speaker 2
But they’re still such messiness.
00:50:53 Speaker 1
Right.
00:50:54 Speaker 2
Ah, or unpolished behind the scenes and the.
00:50:56 Speaker 1
Yeah.
00:50:58 Speaker 2
Including the emotional stuff, right?
00:51:00 Speaker 1
Totally.
00:51:00 Speaker 2
Foremost, and we don’t model that for our students because it’s not really a safe space for modeling that kind of.
00:51:04 Speaker 1
No, and it’s embarrassing.
00:51:06 Speaker 2
We’re supposed to perform, you know, I have this such bright um, PhD student whose graduated as an A postdoc and recently interviewed for a position that I think she ought to have got.
00:51:18 Speaker 2
And you know, the feedback that she got after the campus visit was, well, you just weren't confident enough.
00:51:25 Speaker 1
Oh my God.
00:51:26 Speaker 2
And I thought, that just... I was so depressed for days.
00:51:29 Speaker 2
I just thought, well, you know, that’s basically subtext for.
00:51:33 Speaker 2
You just didn’t perform white masculinity.
00:51:35 Speaker 1
Yes, right.
00:51:36 Speaker 1
It’s like you work too much.
00:51:36 Speaker 2
And so you just weren’t authoritative enough.
00:51:37 Speaker 1
You were a girl.
00:51:39 Speaker 2
You didn’t claim your expertise defined in this particular way, because I think there’s a way to do expertise right to be comfortable, to be an expert is to be comfortable with the.
00:51:51 Speaker 2
Limits of your expertise.
00:51:51 Speaker 1
Yeah, cause you know it and oh, my
00:51:53 Speaker 2
Right, exactly.
00:51:54 Speaker 2
And to be comfortable enough to say, I don't know. Or, look at all of the work that I put into this, and look at how I just had to lie in bed for a whole day and eat cookies because I just couldn't motivate myself to work on the
00:52:03 Speaker 1
Oh,
00:52:03 Speaker 1
my God, right?
00:52:06 Speaker 2
Anymore or I don’t know.
00:52:08 Speaker 1
Yeah.
00:52:08 Speaker 1
Or I spent all day yesterday instead of preparing for this, like getting into a fight with my husband cause we couldn’t.
00:52:13 Speaker 2
Rained.
00:52:14 Speaker 1
It took us two hours to get our kid to school because, you know, the kid of a prof is like, I hate school.
00:52:17
Yeah.
00:52:19 Speaker 1
I don’t want to go, you know, and then then it’s like it’s your fault.
00:52:20 Speaker 2
Ohhhhh no.
00:52:22 Speaker 1
No, it’s your fault.
00:52:23 Speaker 1
And like, it just, like, life lifes, you know. It just happens all the time, and I don't...
00:52:25 Speaker 2
Live.
00:52:26 Speaker 2
Yeah, totally.
00:52:29 Speaker 2
But we.
00:52:29 Speaker 1
Have.
00:52:29 Speaker 2
Perform that in the Academy.
00:52:31 Speaker 1
No.
00:52:31 Speaker 2
So yeah, that’s this like kind of pet project.
00:52:33 Speaker 2
You were gonna talk to me about it later, but it's this thing I have, which,
00:52:35 Speaker 1
Yeah.
00:52:35 Speaker 2
Well, I don’t even know.
00:52:36 Speaker 2
I don’t think it’s a project.
00:52:38 Speaker 2
I don’t think it’s actually Social Research.
00:52:40 Speaker 2
I think it might be a movement, but I feel like it’s important within the Academy because.
00:52:41 Speaker 1
Yeah.
00:52:45 Speaker 2
is, and it comes back to my, I guess, personal history, which is, you know, I come to academia as a first-generation scholar, with an immigrant family, my mother's family, and without a lot of opportunity.
00:53:02 Speaker 2
And I remember, as an undergrad, feeling marked by that, like, on my body, you know, the way that I sat in class, and not understanding subtext.
00:53:08
Hmm yeah.
00:53:12 Speaker 2
And in the sciences, it was less tricky, but I didn’t.
00:53:15 Speaker 1
How so?
00:53:16 Speaker 1
What do you mean?
00:53:16 Speaker 2
Well, because it’s a bit more black and white, right?
00:53:19 Speaker 2
You're taking a math class and, you know, it's... but I took all my electives in the arts, and I took classics.
00:53:24
Yeah.
00:53:25 Speaker 2
And I had never read the classics, right? I mean, I had read some literature, of course, because of the high school curriculum.
00:53:36 Speaker 2
But it was just a lot of catching up, not just in the reading and the learning,
00:53:42 Speaker 2
the content-based learning, but also in how to behave, right?
00:53:45 Speaker 1
And how to talk?
00:53:46 Speaker 2
How to talk?
00:53:47 Speaker 2
Yeah, like and I and it’s so interesting.
00:53:49 Speaker 2
When I left, like, bench science and I started this degree in sociology of technology,
00:53:55 Speaker 2
I got a job as a research assistant doing interviews with, actually, cattle farmers about climate change, which was interesting because this was, like, the early aughts, and no cattle farmer in Alberta was talking about climate change per se.
00:54:06
Wow.
00:54:07 Speaker 1
Yeah, yeah, brain.
00:54:07 Speaker 2
They were living through it, of course, which is an interesting sort of side note. But it was the first time I'd ever heard myself recorded.
00:54:15 Speaker 2
And I remember having this weird kind of like uncanny valley or whatever it’s called.
00:54:20 Speaker 2
Like it was me, but it was so not me.
00:54:23 Speaker 2
I was articulate and I just hated it right.
00:54:25 Speaker 1
Hmm, you’re doing a good job.
00:54:25 Speaker 2
I was.
00:54:26 Speaker 2
I was clearly like performing.
00:54:28 Speaker 2
I was doing a good job performing.
00:54:31 Speaker 2
expert.
00:54:31 Speaker 1
Yeah.
00:54:32 Speaker 2
And so, yeah, I I guess, you know, I’ve started to notice or I started to think early on when I was finishing my PhD and I was pregnant with my son and I was bussing myself all over southwestern Ontario to teach classes on contract and having to, like, pump my breasts on the Greyhound bus.
00:54:49 Speaker 2
And just thinking about.
00:54:50
Yeah.
00:54:52 Speaker 2
The place of I don’t know women in the Academy, the balancing of, as you said before, life, what I might call it, like reproduction and yeah, academic production gender but also class because we’ve kind of stopped in lots of ways talking about class um and yeah.
00:55:09 Speaker 2
So partly the unpolished academic is, I don’t know what, but.
00:55:12 Speaker 2
You.
00:55:13 Speaker 2
Is it a forum?
00:55:14 Speaker 2
Is it a movement?
00:55:15 Speaker 2
I want us to just keep talking about it and modeling the making of the sausage.
00:55:19 Speaker 1
Yeah.
00:55:20 Speaker 2
Which is what you want this podcast to be?
00:55:21 Speaker 1
Yeah, exactly.
00:55:22 Speaker 2
But maybe it’s just this podcast and I can just support your podcast.
00:55:23 Speaker 1
But what I also yeah.
00:55:25 Speaker 2
But you’re not going to rename it the unpolished academic, but.
00:55:27 Speaker 1
No, but that is.
00:55:29 Speaker 1
I’m like I I put in the notes or whatever here.
00:55:31 Speaker 1
I’m like that could be written on my tombstone.
00:55:33 Speaker 1
I feel like that. And yet,
00:55:34 Speaker 1
What is interesting is I’m the child of two academics like I am the child of people who were like, you know, who were professors like.
00:55:42 Speaker 1
So I my mom was a grad student when I was in elementary school, and my dad was already a professor like.
00:55:47 Speaker 1
And yet I still, like, and I can see there's certain things. And it always, like, makes me cringe a bit when I have, like, a first-generation grad student, and I've had committee members say things like, well, they just don't have the gravitas.
00:56:03 Speaker 1
And it’s like it.
00:56:03 Speaker 2
Yeah.
00:56:04 Speaker 1
It makes me so enraged.
00:56:05 Speaker 1
And I’m like how the like, how do you have the gravity like I have the Grove?
00:56:08 Speaker 1
Has because that’s my habit is to borrow from Bourne Holiday.
00:56:12 Speaker 1
It’s like I was born into it.
00:56:14 Speaker 1
I didn’t work for that.
00:56:15 Speaker 1
I didn’t do anything for that like I just it happens to be I, you know, by whatever role of the dice with the sperm and the egg.
00:56:17 Speaker 2
Yeah.
00:56:24 Speaker 1
I came to be.
00:56:25 Speaker 1
But like, it isn’t like it’s such a like, it’s so classist and it is so.
00:56:31 Speaker 1
But the other thing I just keep thinking about, like, going back to your book and this idea of who is the expert:
00:56:36 Speaker 1
It’s like the farm.
00:56:37 Speaker 1
Others are gonna save, like these organic farmers.
00:56:40 Speaker 1
From what you’re saying, it’s like they’re gonna save us from, like, environmental destruction in some of, you know, not to like overinflate it or whatever.
00:56:44 Speaker 2
That’s OK.
00:56:46 Speaker 1
But they have a kind of expertise and it’s like and.
00:56:48
Absolutely.
00:56:50 Speaker 1
And this is where it’s like for me.
00:56:51 Speaker 1
Humility is not about even being humble, like it’s not about like you know.
00:56:57 Speaker 1
In a Christian way, like hiding your light under a bushel, which I
00:57:00 Speaker 1
know from the old musical Godspell. It's not about, like, hiding what we're experts at.
00:57:00 Speaker 2
Yeah, yeah, yeah.
00:57:07 Speaker 2
No, not at all.
00:57:07 Speaker 1
It’s it’s about like owning our expertise.
00:57:10 Speaker 1
It’s like I am an expert at whatever.
00:57:12 Speaker 1
Yes.
00:57:13 Speaker 1
And I I know stuff, but I don’t know everything.
00:57:16 Speaker 1
Yeah.
00:57:16 Speaker 1
And I I have a sociological lens.
00:57:18 Speaker 1
And so, I personally, like, I was an expert guest on a thing about AI, and I was like, I'm not an expert.
00:57:20 Speaker 2
Absolutely.
00:57:25 Speaker 1
Like I don’t know anything, but I have so many opinions and it could be so tempting cause we all have thoughts about these this.
00:57:30 Speaker 1
But good research, being a good scholar, to me is about, well, my friend Maggie,
00:57:30 Speaker 2
Yeah.
00:57:37 Speaker 1
She was talking about, like, conspiracy theorists.
00:57:39 Speaker 1
It’s like this desire to, like, think that you’re really smart.
00:57:42 Speaker 1
And so you create this very convoluted kind of a theory to like explain things and it’s so tempting because it makes you feel like, oh, I have this information that nobody else has.
00:57:52 Speaker 1
I’m smart.
00:57:52 Speaker 1
See how smart I am.
00:57:53 Speaker 1
And it’s like, but what you’re not doing.
00:57:56 Speaker 1
Is your not getting challenged?
00:57:58 Speaker 1
It’s like you’re not questioning yourself.
00:58:00 Speaker 1
You’re not, and so it’s like there’s this balance that we have to engage in of, like, not like, if you don’t speak up about this, if you’re like, oh, I’m not an expert and like, I’m not kind.
00:58:10 Speaker 1
I’m just a girl.
00:58:11 Speaker 1
Like, we’re not gonna get anywhere, but if we’re also like, oh, yeah, I know everything about everything.
00:58:16 Speaker 1
we're also gonna go in a really scary direction, you know.
00:58:20 Speaker 2
Yeah, that’s.
00:58:20 Speaker 1
So it’s like this, not rigorous.
00:58:22 Speaker 2
No, no, it’s true.
00:58:24 Speaker 2
It is a balance.
00:58:25 Speaker 2
I mean, it’s like playing within the system.
00:58:27 Speaker 2
And also, yeah, I’m not against claiming expertise or being confident in ones assertions in ones knowledge gained through hard work and systematic thinking and intelligence.
00:58:38
It’s just a.
00:58:42 Speaker 2
But yeah, I mean, I find it difficult, the demands on academics to perform.
00:58:42 Speaker 1
Yeah.
00:58:51 Speaker 2
It’s not that there is an expertise, but that, like objectivity, authority, expertise.
00:58:56 Speaker 2
Look, I mean, we’re kind of told in these very subtle ways, right?
00:59:00 Speaker 2
You said habitus before. But in order to be read as authoritative or expert, one needs to perform,
00:59:06 Speaker 2
I mean, I really think it’s white masculinity, right?
00:59:09 Speaker 2
Yeah.
00:59:10 Speaker 2
So I think there are different ways to be.
00:59:13 Speaker 2
Yeah, yeah, less polished.
00:59:16 Speaker 2
more honest, maybe more open, right, to deliberation and other points of view, to challenge, and, yeah.
00:59:28 Speaker 2
Yeah, I don’t know.
00:59:29 Speaker 2
There’s a kind of more.
00:59:31 Speaker 2
Yeah.
00:59:32 Speaker 2
So.
00:59:32 Speaker 1
Yeah, yeah.
00:59:32 Speaker 2
So that’s the unpolished academic and.
00:59:33 Speaker 1
I mean, yeah, I love it.
00:59:35 Speaker 1
And that’s and again it’s like like just to I already said it, but I’m just gonna say it again.
00:59:39 Speaker 1
It’s like going back to the farmers.
00:59:40 Speaker 1
It’s like if we don’t do this, it’s not just like, oh, it’s nice.
00:59:43 Speaker 1
It’s not just like we are allowed to live more authentic lives as individuals.
00:59:47 Speaker 1
It’s also that if we all can be like to me, humility is honesty.
00:59:51 Speaker 1
Like that’s it.
00:59:52 Speaker 1
It’s just.
00:59:52 Speaker 1
And honest, it allows us to then like we can tap into knowledge that that that we would otherwise be too afraid to tap into.
01:00:03 Speaker 1
Like to me this is like it you know.
01:00:03
Yeah.
01:00:05 Speaker 2
Yeah, it’s rigger.
01:00:06 Speaker 1
Yes. Yeah.
01:00:06 Speaker 2
Yeah, in a way.
01:00:07 Speaker 2
So, I mean, in a way, that's sort of how I started the journey on AI in agriculture: it was through first looking at GMOs, genetically modified organisms.
01:00:17 Speaker 2
And I was really interested in these legal disputes between farmers
01:00:20 Speaker 2
and Monsanto, and I became interested, for my master's, in
01:00:25 Speaker 2
why these groups of farmers without any resources were going to bat against these large, powerful, like, suited-up, you know, agribusinesses like Monsanto.
01:00:35 Speaker 2
Why? And I made sense of it at the time, which is how I framed my master's in sociology, as a kind of social activism, right?
01:00:42 Speaker 2
There was like nowhere but the courts to gain visibility to the kinds of issues that they were concerned about.
01:00:47 Speaker 2
Again, the non-technical issues, they were almost just using these lawsuits as a spectacle.
01:00:47 Speaker 1
Interesting.
01:00:54 Speaker 1
Interesting, yeah.
01:00:54 Speaker 2
Um, yeah.
01:00:55 Speaker 2
But then, for my PhD, I continued to focus on these lawsuits,
01:01:01 Speaker 2
partly because the lawsuits continued all the way up through to the Supreme Court level; this was not going away.
01:01:08 Speaker 2
One of them in particular was highly visible.
01:01:10 Speaker 2
The Percy Schmeiser case.
01:01:11 Speaker 2
In fact, I think there was a movie made out of it.
01:01:13 Speaker 2
Christopher Walken played Percy Schmeiser the farmer, and it was Monsanto versus Schmeiser.
01:01:19 Speaker 2
And so the lawsuits sort of continued and I followed them, but it was also that I became really interested in this lack of humility, this surety, the way that the legal actors were defining expertise and risk, right?
01:01:33 Speaker 2
Because, spending time with the farmers, I knew the kinds of facts that they were submitting to the court. I came to know the farmers through the ethnographic study I'd done for my master's, and I knew, as you said, that they have a kind of expertise that I would call scientific, right, experiential knowledge of the land.
01:01:52 Speaker 2
They intervene, right? That's science:
01:01:54 Speaker 2
it's systematic intervention into the natural
01:01:56 Speaker 2
world. And, in the case of the Schmeiser trial, it was
01:01:59 Speaker 2
so interesting to me.
01:02:00 Speaker 2
You know, Schmeiser had his neighbor testify that this GM canola actually flew off the back of a truck and landed on Schmeiser's property.
01:02:08 Speaker 2
Schmeiser had detailed photographic evidence that these GM seeds were like growing on the perimeter of his property.
01:02:15 Speaker 2
And that’s how he explained the so-called.
01:02:18 Speaker 2
Contamination or the the the company said he was growing it illegally, that he hadn’t paid them and he said no, no, no.
01:02:22
Ohhhhh.
01:02:25 Speaker 2
You’ve polluted my conventional seed, and so it was this, like, basically battle over that right?
01:02:27
No.
01:02:31 Speaker 1
Wow.
01:02:31 Speaker 2
And I think there were,
01:02:32 Speaker 2
I ended up knowing someone on that case, he's actually a colleague at the University of Ottawa now.
01:02:37 Speaker 2
There were problems with the facts, the things Schmeiser submitted to the court.
01:02:42 Speaker 2
But what I found so interesting was the legal actors, the way they were defining expertise and drawing boundaries.
01:02:48 Speaker 2
So, for example, that testimony from Schmeiser's neighbor, Boyster Meyer
01:02:52 Speaker 2
is his name.
01:02:53 Speaker 2
They basically dismissed that.
01:02:54 Speaker 2
And yet Monsanto hired this aeronautical engineer to talk about the kind of average wind speed at which a seed would fall.
01:03:02 Speaker 2
And they called that expertise.
01:03:05 Speaker 2
Like, the language that the legal reasoner used was "expert," versus Boyer's testimony, "opinion."
01:03:11 Speaker 1
Wow.
01:03:12 Speaker 2
Which, like, those kinds of syntactic, you know, dichotomies, expert versus opinion...
01:03:18 Speaker 2
That’s how I became interested in language and, well, the kind of thinking carefully about or taking a humble position on what we know, right, the limitations of our knowledge.
01:03:28 Speaker 2
And how that then plays into who can participate in defining our future, the future of food. And that's really how I then, you know, made my way into data and AI, because I started to follow...
01:03:32 Speaker 1
Yeah.
01:03:40 Speaker 2
I started to think about power and language and these agribusinesses.
01:03:45 Speaker 2
Um, and I started to follow their purchasing habits and their investments.
01:03:52 Speaker 2
Like, I was looking at, you know, financial documents for these big firms and what they were doing with research and development around these seeds, because these are patented objects, these seed systems. And in, like, 2015, it became clear to me:
01:04:05 Speaker 2
their seed patents were coming up — on the major technologies, the Roundup technologies for Monsanto, in 2022.
01:04:12 Speaker 2
But it didn’t seem to me the company was investing in future genetic research
01:04:16 Speaker 2
to the extent that it was investing in analytics and data companies. They were buying them up, exactly.
01:04:20
Oh, why?
01:04:22 Speaker 2
So it started out as just this curiosity.
01:04:24 Speaker 2
Like, wait a second — is Monsanto an analytics company?
01:04:28 Speaker 2
And like, where do they get their data from and what are they doing with data?
01:04:31 Speaker 2
And then there was, you know, the public conversation around data.
01:04:34 Speaker 2
And I was thinking, nobody’s paying attention
01:04:36 Speaker 2
to companies like Monsanto, to Bayer,
01:04:40 Speaker 2
to John Deere,
01:04:41 Speaker 2
yeah, as data companies — where are they getting data from?
01:04:43 Speaker 1
Yeah.
01:04:46 Speaker 2
What are?
01:04:47 Speaker 2
What are they doing with big data sets?
01:04:47 Speaker 1
We gotta you also said there, but OK.
01:04:50 Speaker 2
Yeah.
01:04:50 Speaker 2
And, like, how do I find out how these data are being monetized?
01:04:56 Speaker 2
They must be, I figured, part of
01:04:57 Speaker 2
a future revenue stream for the companies,
01:04:59 Speaker 2
if there was such investment
01:05:00 Speaker 2
made in data. And so that’s how the book started.
01:05:05 Speaker 1
So what did you find like so?
01:05:07 Speaker 2
Well, I mean, the first thing I found was that it was really difficult to find anything, right? Because the data sets — it’s basically John Deere tractors, and other tractors too —
01:05:08
No.
01:05:18 Speaker 2
But John Deere holds the majority of the market, at least in North America, and also in countries like Brazil.
01:05:25 Speaker 2
And because they hold the market — they just dominate the market
01:05:29 Speaker 2
for machinery. Now, every John Deere tractor is basically a data-collecting tractor.
01:05:36 Speaker 2
It’s licensed, actually — it’s not owned.
01:05:40 Speaker 2
It’s like a cell phone, and it embeds hundreds of sensors that collect data passively from farms, and these data are sent to cloud-based infrastructure. The company
01:05:47 Speaker 2
makes the data proprietary, and actually we know that they’ve legally transferred data to input supply companies like Monsanto. So I knew this, but, like, following the data, getting access to the data sets — farmers can’t get access.
01:06:01 Speaker 2
I couldn’t, as a critical researcher, gain access to the data sets.
01:06:04 Speaker 1
Yeah.
01:06:05 Speaker 2
I tried to do interviews, but I had published enough,
01:06:08 Speaker 2
OK, on power and GMOs that, you know, people within businesses didn’t want to speak to me.
01:06:14 Speaker 2
So I mostly got the stakeholder relations person who gave me like boilerplate, which was actually really fascinating because I came to see similar messaging across the companies from the stakeholder relations, which is a finding in and of itself.
01:06:23 Speaker 1
Right, yeah.
01:06:26 Speaker 2
Um, but then often I would get a retired scientist who would say, OK, I can speak to you because I’m no longer subject to a nondisclosure agreement.
01:06:34 Speaker 2
So I started to see, actually, that it was not just copyright law, intellectual property law, that was preventing me from following the data and figuring out who
01:06:43 Speaker 2
is, you know, gaining economically or otherwise from the data — but also trade secrecy law.
01:06:50
Yeah.
01:06:50 Speaker 1
Wow.
01:06:51 Speaker 2
Yeah.
01:06:52 Speaker 2
So it was really hard to tell — which is a finding in and of itself.
01:06:54 Speaker 2
It was like, oh, wait a second — at least in terms of my interactions with regulatory actors, we need to be thinking about transparency and openness, right? I mean, for the basic reason that these companies are collecting the data.
01:07:07 Speaker 2
And then companies like Monsanto run the farm data through proprietary algorithms.
01:07:11 Speaker 2
And then sell advice to farmers.
01:07:13 Speaker 2
now it’s called precision agriculture — about how best to farm.
01:07:18 Speaker 2
If we can’t validate the algorithm, how do we know it’s actually doing what it says it’s going to do?
01:07:21 Speaker 1
Right.
01:07:23 Speaker 2
And there’s a clear vested interest — I have an article in Big Data &amp; Society that tries to look at what the algorithms do, right.
01:07:33 Speaker 2
And if you look, they only recommend chemicals, for example, within the same ecosystem of products from the same company. From a basic public good perspective, we need transparency, or at least we need access for critical researchers, or those who might validate what these algorithms say they’re doing for farmers.
01:07:38 Speaker 1
Wow, I get it now.
01:07:48
Oh, wow.
01:07:52 Speaker 1
Oh my God.
01:07:53 Speaker 2
But also from like a broader kind of social justice perspective.
01:07:55 Speaker 1
Oh, my God.
01:07:56 Speaker 2
So so.
01:07:56 Speaker 2
I mean, I couldn’t.
01:07:57 Speaker 2
I couldn’t figure it out. But then the story — I mean, the reason the book is called The Immaculate Conception of Data, as opposed to, like, “power” —
01:08:04 Speaker 2
I think my book proposal was, like, “big data, big power and ag” or something — I saw that there were ways to infer that these companies are doing things with data that reinforce their already completely inequitable market power, which then leads to lobbying and helps them define all of agriculture, not just for farmers but also for consumers and the environment.
01:08:07
Yeah.
01:08:27 Speaker 2
As you said before, the non-human environment. But I started to notice in my data set
01:08:35 Speaker 2
something else,
01:08:36 Speaker 2
something bigger, I think, even than the whole monetary power via data — which was that everyone, even, I found, critical activists, those of us who were in this space in 2016, 2017, and just kind of everywhere, every convention I went
01:08:55 Speaker 2
to, there was a way of talking
01:08:57 Speaker 2
about data
01:08:58 Speaker 2
and AI
01:09:00 Speaker 2
that was shared, and that was immaculate.
01:09:04 Speaker 2
It was, like, basically, you know, using phraseology like “data-driven” and “the AI system knows” and “AI is smart” and “feeding” — all of which you might just think, oh, it’s just metaphors.
01:09:15 Speaker 2
Oh, it’s just language — but I started to see that it was a kind of useful, speaking of rhetoric, rhetorical tool for people who are trying to win,
01:09:24 Speaker 2
right, resources — whether it was, you know, government grants, or getting farmers to buy things, or convincing a social scientist that this is the future of agriculture.
01:09:37 Speaker 2
So there was this kind of futurism — like, AI is the future, and the AI tool itself is driving the bus.
01:09:43 Speaker 2
It’s going to get us there inexorably. And it was that way of talking — as if AI or data are immaculately conceived and not a product of human intervention — that I started to see as the real finding in the book.
01:09:57 Speaker 2
And obviously then it’s the framework that gives the book its title — and the real problem.
01:10:02 Speaker 2
Coming back to what I see as the main point of doing social research — because I started to see that it’s a framework that exceeds agriculture.
01:10:09 Speaker 2
It’s everywhere.
01:10:10 Speaker 2
Yeah, people talk about data that way everywhere.
01:10:10 Speaker 1
Oh, totes, yes.
01:10:13 Speaker 2
And, you know, I started to think, well, what happens —
01:10:15 Speaker 2
what are the consequences for the actual unsavory things,
01:10:19 Speaker 2
or, you know, things that are being done with data?
01:10:23 Speaker 2
Well, if we talk about data as immaculately conceived, then we’re not really alive to — right,
01:10:30 Speaker 2
we’re not as alert to, sorry —
01:10:32 Speaker 2
the power and the politics and the actual decisions being made.
01:10:37 Speaker 1
Yeah.
01:10:37 Speaker 2
By whom?
01:10:38 Speaker 2
Right.
01:10:38 Speaker 2
We can’t ask those precise questions about who collected these data.
01:10:42 Speaker 2
Yeah.
01:10:42 Speaker 2
Who’s monetizing them?
01:10:44 Speaker 1
No.
01:10:44 Speaker 2
Who’s made powerful or disempowered by the use of these data?
01:10:47 Speaker 2
Or this algorithm? And it’s —
01:10:49 Speaker 1
Oh, totally no.
01:10:49 Speaker 2
So there was a real politic —
01:10:50 Speaker 2
there’s a politics to that framework.
01:10:52 Speaker 1
And it’s.
01:10:54 Speaker 1
Yeah.
01:10:54 Speaker 1
And it’s — oh
01:10:55 Speaker 1
my God, there’s, like, so many thoughts I have.
01:10:58 Speaker 1
Also, I love Stuart Hall, and Stuart Hall’s, like, theories of, like, you know — that it
01:11:00
Yeah, totally.
01:11:03 Speaker 1
is the most powerful way to sort of control any kind of narrative or belief: when you start having people — like, experts — take this in, and there’s this framework of, like, repeating it and believing it, and you don’t need Monsanto to be making this argument when you have 100,000 other people making it — it becomes common sense, you know. And it also — I was thinking about — well, one small thing: I was chatting about this with my husband this morning after we finally
01:11:28 Speaker 2
Yeah, absolutely.
01:11:37 Speaker 1
got our kid to school only five minutes
01:11:39 Speaker 1
late, really.
01:11:39 Speaker 1
And I was trying to say, like, OK — this was my, oh my God, it was amazing —
01:11:41 Speaker 2
It’s good.
01:11:43 Speaker 1
It was like — but one of the strategies that I used was to use a timer. And it was like — cause of this idea, I think — where I was like, OK, I’m setting a timer for 15 minutes, and then in 15 minutes it’s going to be time to go. And they were like, oh, OK. Like, it took more than that.
01:11:57 Speaker 1
Like, it wasn’t that magical. But I was just saying, it’s, like, so funny that there’s this kind of, like, objectivity in the data of the clock.
01:12:04 Speaker 1
You know, in this clock where it’s like.
01:12:05 Speaker 1
You know, I was talking about like sometimes students will be like, well, the PowerPoint slide said whatever.
01:12:10 Speaker 1
And I’m like, yeah, who the **** do you think made that PowerPoint slide? Like, that is me.
01:12:13 Speaker 2
I.
01:12:14 Speaker 1
I made that. Like, that —
01:12:15 Speaker 1
the PowerPoint is me. Like — there’s this sort of, like, a really clear rubric:
01:12:18 Speaker 1
Like, I often will make really clear rubrics for my students, and it’s like, I’m the one —
01:12:22 Speaker 1
you’d get the same grade as if I just gave you a
01:12:24 Speaker 1
paragraph
01:12:25 Speaker 1
straight. But it’s like there’s this idea of, like, well, they can calculate the numbers.
01:12:29 Speaker 1
And this was this percent and that — and it’s like, don’t you realize that I actually am, like, futzing with it to make it be the B that I was going to give you
01:12:35 Speaker 2
Absolutely.
01:12:36 Speaker 1
No matter what.
01:12:36 Speaker 2
So that is so interesting.
01:12:37 Speaker 1
You know, but it’s it.
01:12:38 Speaker 2
Absolutely.
01:12:39 Speaker 2
Like absolutely.
01:12:40 Speaker 1
It’s like this idea of objectivity through these kinds of forms.
01:12:43 Speaker 1
Of data.
01:12:44 Speaker 1
That just aren’t real. Yeah.
01:12:46 Speaker 2
So that’s exactly it.
01:12:48 Speaker 2
So in the book, in Chapter 5 in particular, I draw on kind of a history of scholarship on what we might call the politics of numbers — particular understandings of objectivity that then map onto, right, positivistic frameworks and numerics.
01:12:55 Speaker 1
OK.
01:13:06 Speaker 2
And to try to make sense of the omnipresence, as you say, of this framework —
01:13:13 Speaker 2
it’s not just greenwashing or a particular message coming from Monsanto. In the book I ethnographically trace it from corporate ads all the way to the farm, right, when farmers are saying, I’m doing this because this is the future of farming. And it was so interesting —
01:13:15 Speaker 1
No.
01:13:25 Speaker 1
Yeah.
01:13:26 Speaker 2
Farmers would say to me — like, I would say, well, what have you learned from this field mapping and this AI?
01:13:32 Speaker 2
Oh well, nothing — like, I knew that, I knew that part of my field was productive.
01:13:37 Speaker 2
I —
01:13:39 Speaker 2
yeah, but now I know.
01:13:42 Speaker 2
Yeah, they would say, “I know it now.” But then there was also a way that, like, the systems around crop insurance, for example, were being designed with this Immaculate Conception of data —
01:13:44 Speaker 2
Yeah.
01:13:53
I don’t know.
01:13:54 Speaker 2
this presupposition. So farmers, right, would submit all sorts of qualitative data, or Excel spreadsheets, and —
01:14:00 Speaker 2
There’s this one Canadian government ag climate impact reporting tool.
01:14:04 Speaker 2
And increasingly, farmers had to go through this algorithmic way of knowing, right, in order to access things like particular insurance provisions.
01:14:13 Speaker 2
And so yeah, it’s about the omnipresence of that framework and the politics of it.
01:14:17 Speaker 2
So, yeah, I tried to draw on, you know, other people in data studies — and in a way, the book is very much in conversation
01:14:23 Speaker 2
with really well-known folks like Kate Crawford and Ian Bogost, who have talked about a kind of religiosity toward algorithms.
01:14:31 Speaker 1
Hmm.
01:14:32 Speaker 2
Um, but I found that work very descriptive.
01:14:37 Speaker 2
Like, just sort of pointing out, you know, there’s this religious view people have about AI, and describing it as a kind of epistemic distortion — almost like, you know, people are not looking at things correctly.
01:14:51 Speaker 2
Instead, I draw on history of science scholarship, like Ted Porter’s Trust in Numbers — which is exactly what you describe about your students and the, you know, numeric calculation, right, this particular interpretation of objectivity coming from numbers per se.
01:15:08 Speaker 2
Yeah, I draw on that scholarship to try to explain or analyze why — like, why the omnipresence of this framework — and in order to, like, try to show through a lot of stories,
01:15:13 Speaker 1
Yeah, I don’t. Yeah.
01:15:21 Speaker 2
And again, that kind of ethnographic intervention.
01:15:23 Speaker 2
And how this way of talking about data and algorithms is really useful.
01:15:30 Speaker 2
It’s rhetoric.
01:15:30 Speaker 2
It’s really powerful, right?
01:15:32 Speaker 2
Like if you can do this, I could trace right then.
01:15:36 Speaker 2
Oh, look at this.
01:15:36 Speaker 2
Supercluster got funded, right?
01:15:37 Speaker 1
Yeah.
01:15:38 Speaker 2
And this was their —
01:15:38 Speaker 2
that showed literally this completely, like, non-empirical, non-evidence-based
01:15:45 Speaker 2
graphic in this bid for major government funding: between the beginning of agriculture, an asymptotic line to Ag 4.0, and that was the history of AI and agriculture.
01:15:58 Speaker 1
Yeah.
01:15:58 Speaker 2
Like, what does that even mean?
01:16:00 Speaker 1
Yeah.
01:16:00 Speaker 2
Yeah — but if you could make this claim that, you know, we’re going toward this perfect future,
01:16:06 Speaker 2
Yeah, with no environmental impact.
01:16:09 Speaker 1
Yeah.
01:16:09 Speaker 2
where all farmers
01:16:11 Speaker 2
make all the money, and consumers are happy, and there’s no food waste, and there’s traceability, and so therefore there’s, like — there’s no disease, and —
01:16:19 Speaker 2
Isn’t this good?
01:16:20 Speaker 2
the future — and it’s data and AI that are going to get us there. If you can, if you can — it’s back to you.
01:16:21
Yes.
01:16:24 Speaker 1
Yeah.
01:16:27 Speaker 2
I’m.
01:16:28 Speaker 2
What did you say?
01:16:29 Speaker 2
at the beginning — dictators? No, totalitarians, right? Back to that.
01:16:30 Speaker 1
Oh, totalitarian.
01:16:33 Speaker 1
Yeah, yeah, yeah, totally.
01:16:34 Speaker 2
It’s a.
01:16:34 Speaker 2
It’s a great messaging strategy, and so that’s what I tried to explain in the book.
01:16:37 Speaker 1
And it works. And I’ll just say — because I like to pretend like I’m an expert on everything —
01:16:41 Speaker 1
And so because.
01:16:42 Speaker 2
You don’t — no, you’re modeling humility.
01:16:43 Speaker 2
No.
01:16:44 Speaker 1
But I do, I try. The summer before eighth grade, my family moved from Connecticut to Illinois, and I lived in, like, west-central Illinois, which is a very rural farming —
01:16:54 Speaker 2
Farming.
01:16:56 Speaker 1
— community, and, like, we would have to drive an hour and a half to get to an airport. And if you actually go to the Quad Cities —
01:17:03 Speaker 1
I think it’s Moline, technically — there’s, like, four cities
01:17:07 Speaker 1
That’s right.
01:17:09 Speaker 1
Along the Mississippi River between Iowa and Illinois.
01:17:12 Speaker 1
And I did my grad degree in Iowa.
01:17:14 Speaker 1
Also a very farming place. And if you go to the airport, it’s, like, all John Deere — like, cause I would fly with, like, my kids, and so there’d be, like, play things that were, like, these foamy things.
01:17:19 Speaker 2
Play.
01:17:24 Speaker 1
And at that Coralville mall — we’d take our kids there —
01:17:26 Speaker 1
and it was, like, corn and tractors. Like, this is very rural communities.
01:17:29 Speaker 1
And I remember having this conversation —
01:17:33 Speaker 1
This is a long time ago, but the husband of one of my mom’s best friends is a farmer.
01:17:38 Speaker 1
His name is Ned, and he’s a lovely person, and he has three daughters who are brilliant and all went on to do brilliant things.
01:17:39
OK.
01:17:45 Speaker 1
Um, and — but he’s, you know, one of the few.
01:17:47 Speaker 1
Like, he’s got a big farm, and it’s corn and soy — like, this is everything there.
01:17:52 Speaker 1
And I remember saying.
01:17:53 Speaker 1
Once in, like the early 2000s, I was like, you know, but isn’t Monsanto bad?
01:17:58 Speaker 1
And he was really defensive, and, like — oh, what, like, are you gonna — like, the Commons?
01:18:05 Speaker 1
You’re gonna, like, go to —
01:18:06 Speaker 1
go to, go to somewhere — his daughter had been in the Peace Corps,
01:18:09 Speaker 1
I think — he’s like, you know, whichever country it was —
01:18:11 Speaker 1
He’s like, go there and see how the Commons is doing for you.
01:18:14 Speaker 1
And he was like, you know — like, it costs something.
01:18:17 Speaker 2
The Commons, as in communism?
01:18:18 Speaker 1
Like — yeah, cause he was basically saying, like, you’re — and he’s a progressive, like, he’s not conservative. Like, this isn’t — he’s a very intelligent,
01:18:23 Speaker 2
Yeah, yeah, yeah.
01:18:28 Speaker 1
like, thoughtful person — but very, you know, he’s a farmer, from generations of farmers. And one of his daughters actually — like, she’s a farmer.
01:18:36 Speaker 1
She went to Cornell and she does organic farming now, so different from her father.
01:18:40 Speaker 1
But like — and they had a baby named Buck, which I think is the coolest name, right.
01:18:44 Speaker 2
Adorable.
01:18:44 Speaker 2
So was Ned.
01:18:45 Speaker 1
I know, I think.
01:18:45 Speaker 2
Have you told me about Ned?
01:18:47 Speaker 1
I think we talked about him once before.
01:18:47 Speaker 2
I don’t talk.
01:18:49 Speaker 1
Yeah.
01:18:49 Speaker 1
Yeah, Ned the farmer.
01:18:50 Speaker 1
I’m like, here’s my one farmer story. Anyway — but I still found it really interesting, especially how you talk. And also, I follow on TikTok or somewhere this
01:18:58 Speaker 1
Farmer, who I really enjoy.
01:19:00 Speaker 1
He’s a cow farmer, I think in Iowa, but his whole thing is, like, trying to show how his cows are treated — how — and exactly — but in a very “look how nice they are” way.
01:19:08 Speaker 2
How the sausage is made.
01:19:12 Speaker 1
But so much of it is look at the technology.
01:19:15 Speaker 1
Look at how — and so Ned’s point was, I have a better crop yield because of Monsanto products, and I’m gonna pay for that.
01:19:15 Speaker 2
Yeah.
01:19:22 Speaker 1
Like I’m not gonna use like old seeds that like have more disease.
01:19:25 Speaker 1
He’s like, you know — like, it was very pragmatic.
01:19:28 Speaker 1
And I was, like, shut down, and I didn’t —
01:19:30 Speaker 1
I was, like, not enough of an expert at anything, and I was like, OK, I’m gonna go get a drink. But then —
01:19:36 Speaker 1
but this other guy that I follow on
01:19:37 Speaker 1
TikTok —
01:19:38 Speaker 1
It’s like it’s exactly what you’re describing.
01:19:41 Speaker 1
Is this like an?
01:19:42 Speaker 1
Again, it’s not that farmers are ignorant — it’s that they’re actually smart in a certain sense. Like — I mean, they are smart.
01:19:48
Oh, yeah.
01:19:50 Speaker 2
Super rational actor.
01:19:52 Speaker 2
Not to laugh.
01:19:52 Speaker 1
Rational choice theory.
01:19:53 Speaker 1
That is the theory.
01:19:54 Speaker 1
And.
01:19:55 Speaker 1
I remember, before, rational choice theory — but, like, everything that you’re saying,
01:19:55 Speaker 2
Yeah.
01:19:56 Speaker 2
OK, got it.
01:20:00 Speaker 1
it’s like, of course — like, absolutely you’re gonna do this, unless you have someone like you coming along and being, like, can we think about this for a second?
01:20:07 Speaker 2
I mean, it’s very complex.
01:20:09 Speaker 2
First of all, I would lead by saying all farmers are such hard workers.
01:20:13 Speaker 1
Yes.
01:20:13 Speaker 2
And, like, even though I’m not myself connected to a farm — my grandma grew up on a farm in Saskatchewan —
01:20:19 Speaker 2
but I do see that for all farmers, regardless of size and equipment,
01:20:24 Speaker 2
it’s a super, super hard-
01:20:25 Speaker 2
working, thankless job.
01:20:27 Speaker 2
Thank goodness we still have people who are willing to grow food because most of them don’t make a livable wage, right?
01:20:33 Speaker 2
Most farms in Canada are supported by off-farm income — which is just, to me — like, if you’re interested in injustice or the environment, you talk about farming.
01:20:40 Speaker 1
Right.
01:20:41 Speaker 1
Oh my God, I have seen — have heard —
01:20:42 Speaker 1
like —
01:20:43 Speaker 1
maybe it was in your book I read it.
01:20:45 Speaker 1
But I remember just seeing the statistics — it might have been when I was teaching —
01:20:49 Speaker 1
but, like, the demographic data in the census — like, the percentage of the population in, like, 1900 that were farmers, that that was their occupation.
01:20:53 Speaker 2
The exodus from agriculture?
01:20:54 Speaker 2
Oh, in farming.
01:20:57 Speaker 1
And then if you look at, like, 1999, it was like 2% or something in the United States — of people who say that their career is farmer — whereas it was like, I don’t know, 65, 70 percent.
01:21:01 Speaker 2
Yeah.
01:21:06 Speaker 1
It’s like less than one.
01:21:07 Speaker 1
Like, that is major social transformation over, you know — and it’s —
01:21:11 Speaker 2
Right — the liveliness, back to those rural communities — and yeah, it’s a —
01:21:15 Speaker 1
Right. And so also, like, the knowledge collection —
01:21:17 Speaker 1
It’s like you don’t have neighbors anymore.
01:21:18 Speaker 1
Of course you’re gonna rely on, like, this company coming along and being like, well, we collected data from around the world and we have calculated this algorithm, and we can tell you, here’s the
01:21:28 Speaker 1
Best way to do it?
01:21:28 Speaker 2
We did — specific to your —
01:21:29 Speaker 1
Right.
01:21:30 Speaker 2
It’s like precision medicine — at least, that’s the message.
01:21:33 Speaker 1
Yeah.
01:21:33 Speaker 2
Again, no one has really validated
01:21:37 Speaker 2
whether this is a good tool in terms of productivity gains, or specifically the environmental impacts — cause that’s part of the message, that, you know, through these kinds of precise, AI-driven interventions, farmers will make more judicious use of things like chemicals and water.
01:21:55 Speaker 2
And heck, I’m not against that.
01:21:56 Speaker 2
That’s
01:21:57 Speaker 2
alright.
01:21:57 Speaker 2
So I’m not — like, I will leave it saying, hey, all farmers are hard-working and thank God for farmers — but also, I’m not against the technology.
01:22:06 Speaker 2
I’m just.
01:22:07 Speaker 2
Yeah, you know, wanting to look carefully, in my social research, at who controls the technologies.
01:22:16 Speaker 2
Right.
01:22:17 Speaker 2
Ask — and then ask precise questions, like: who stands to gain the most, right? And even if farmers gain a little bit, can that gain be compared —
01:22:26 Speaker 2
it’s like our platform use versus, right, these sort of massive economic gains in the uses and reuses of data by the platform companies?
01:22:33 Speaker 1
Yes.
01:22:34 Speaker 2
If you think about the holdings of, say, Facebook, or Google, Apple, Amazon, right.
01:22:40 Speaker 2
And so it’s just about, it’s about thinking about power.
01:22:42 Speaker 1
Yeah, it’s like you were just asking: what
01:22:45 Speaker 1
are we doing?
01:22:45 Speaker 1
Yeah, just, like, what —
01:22:47 Speaker 2
Yeah.
01:22:47 Speaker 1
It’s the general question — it’s like, yeah —
01:22:48 Speaker 2
Who’s who’s really benefiting, and should we set some parameters?
01:22:52 Speaker 2
Like, should farmers pay for this advice? Or, because they effectively pay with their data,
01:22:57 Speaker 2
should they get the advice for free?
01:22:58 Speaker 1
Yeah, no kidding.
01:22:59 Speaker 2
Or should we prevent these companies from being able to transfer data among themselves, or to sell the data sets to insurance or reinsurance companies that literally
01:23:09 Speaker 2
stand to profit off of farmer loss? Should they be able to use these data to predict areas of chemical need and set prices?
01:23:17 Speaker 2
Because these companies are so big in terms of their market share, they’re oligopolies.
01:23:23 Speaker 2
You know, all along the food chain in agriculture there’s a small handful of companies — like you said, everyone is a John Deere person, right?
01:23:29 Speaker 1
Yeah, yeah.
01:23:31 Speaker 2
And that’s — no kidding.
01:23:32 Speaker 1
**** about it —
01:23:33 Speaker 1
I don’t know the others.
01:23:33 Speaker 2
Yeah — think about it.
01:23:34 Speaker 1
I don’t even know what the other companies are.
01:23:35 Speaker 2
Yeah, but there are a small handful of machinery companies.
01:23:39 Speaker 2
Yeah — a small handful of companies. You know, four companies now control the entire global market for seeds.
01:23:46 Speaker 1
Wow, that’s unbelievable.
01:23:47 Speaker 2
Unbelievable.
01:23:49 Speaker 2
So you know, what does that do to choice for farmers?
01:23:53 Speaker 2
Yeah — for consumers? What does that mean in terms of, like, how
01:23:58 Speaker 1
decisions are made in Ottawa or Washington?
01:23:58 Speaker 1
Yeah.
01:23:59 Speaker 1
Accountability.
01:24:00 Speaker 1
Like, totally it’s like.
01:24:01 Speaker 2
Yeah.
01:24:02 Speaker 2
Like the lobbying — so it’s really just about power.
01:24:06 Speaker 2
I mean, it’s power all the way down for me, but totally.
01:24:09 Speaker 2
Yeah.
01:24:10 Speaker 2
And so for that farm — for Ned, for Ned, who’s, you know — that’s part of it too: the messaging, right.
01:24:13 Speaker 1
If —
01:24:13 Speaker 1
he’s retired now, but —
01:24:16 Speaker 2
And that’s one way that power operates, and it’s a way that centrally interests me.
01:24:21 Speaker 2
Obviously I’ve made that claim,
01:24:22 Speaker 2
I think, really clear through this podcast: the use of rhetoric and language, and how power gets set through messaging, right.
01:24:29 Speaker 1
Yeah.
01:24:31 Speaker 2
And so farmers are told in so many ways — including in corporate advertising, their ag expert advisor, right, their service provider, government advice — that you gotta get big or go home.
01:24:45 Speaker 2
That was the famous language of Earl Butz, the Secretary of Agriculture in the US, who was really responsible for the beginning of — I would say the takeoff of — big corn and big soy in the US.
01:24:55 Speaker 2
And I have an amazing colleague, Jennifer Clapp, who’s a CRC Tier 1, who has looked at the history of basically political tools like subsidies in the US, and how, you know, US food aid and US policies, internationally and also domestically,
01:25:08
Yeah.
01:25:13 Speaker 2
were used to basically set a particular vision for agriculture.
01:25:17 Speaker 2
We might call it industrial — the kind Ned does, right: big corn, yeah,
01:25:20 Speaker 2
big, big soy — in order to establish dominance for the US in the postwar period.
01:25:25 Speaker 1
Yeah.
01:25:25 Speaker 2
So it’s all wrapped up in empire-building, basically. But basically that was the messaging, and has been the messaging, coming from corporations in a bazillion different ways for farmers.
01:25:28 Speaker 1
Oh my God, yes.
01:25:34 Speaker 2
You know, there’s not really money in primary production unless you’re one of the companies selling tractors and seeds.
01:25:41 Speaker 2
And and farmers are told the only way to make money is to get bigger, to buy the newest technology, usually by taking on more debt.
01:25:49 Speaker 2
Right, right.
01:25:49 Speaker 2
And therefore you can make a bit more money.
01:25:52 Speaker 2
You can outcompete your neighbors, right?
01:25:53 Speaker 2
What is that?
01:25:54 Speaker 2
The law of marginal returns? You just — you know, tiny little margins — um, and then you buy up the neighboring farms that go out of business.
01:26:02 Speaker 2
And but it’s.
01:26:03 Speaker 2
Yeah, it’s a tough gig.
01:26:04 Speaker 2
It’s not the way to make real money.
01:26:04 Speaker 1
It’s like Monopoly, yeah.
01:26:05 Speaker 2
The way to make real money — there’s an interesting report that I analyze in the book, on data in ag.
01:26:11 Speaker 2
It’s from Goldman Sachs, and they’re basically radically honest.
01:26:15 Speaker 2
I mean, it’s for investors.
01:26:16 Speaker 1
Yeah.
01:26:16 Speaker 2
So —
01:16:17 Speaker 2
like, of course they’re honest, right?
01:26:18 Speaker 2
It’s not for critical scientists to look at.
01:26:21 Speaker 2
This was messaging for investors investing in so-called precision agriculture, and the Goldman Sachs report says something like: in a gold rush, sell shovels. Like, you gotta sell
01:26:31 Speaker 2
the thing that’s collecting the data, or you sell the algorithm that’s giving the advice — but you’re not —
01:26:36 Speaker 2
you’re not the person shoveling the gold.
01:26:38 Speaker 1
Oh my God, that is so brilliant.
01:26:40 Speaker 1
That is so, although the evil part of me is like, how can I parlay that into my new brand, but?
01:26:49 Speaker 2
Well, I don’t know.
01:26:49 Speaker 1
But.
01:26:50 Speaker 2
Yeah — in your bid for becoming an influencer.
01:26:53 Speaker 2
I don’t know.
01:26:53 Speaker 2
Yeah, maybe we need to turn the tools against the powerful.
01:26:57 Speaker 1
Yeah.
01:26:57 Speaker 2
Is there a way to use the tools for?
01:27:00 Speaker 1
Yeah, yeah.
01:27:01 Speaker 1
I don’t know.
01:27:01 Speaker 1
I’ll have to think about it.
01:27:02 Speaker 1
We should probably end here. Is there anything else you wanna add?
01:27:06 Speaker 1
Just I could talk to you all day and I’m.
01:27:09 Speaker 1
I definitely have to get back to.
01:27:10 Speaker 1
I started reading your book and then got distracted by 100 other things, but I wanna assign it.
01:27:14 Speaker 1
Also, I just wanna say in my.
01:27:15 Speaker 1
Like grad quantitative courses, cause when I’m teaching statistics I’m always trying to explain.
01:27:19 Speaker 2
Call it the September is.
01:27:20 Speaker 2
Yeah.
01:27:22 Speaker 1
It’s like everything is a story.
01:27:24 Speaker 1
I was like what you are trying to do with this.
01:27:25 Speaker 1
It’s like these data aren’t true.
01:27:27 Speaker 1
These data are they’re just a particular form of evidence that allows you to tell a story, and it’s like people don’t.
01:27:35 Speaker 1
And I that to me, because most people aren’t going to end up using statistics.
01:27:38 Speaker 1
Most people are.
01:27:39 Speaker 1
It’s like they’ll run a regression analysis in my class and they’re never going to do it again.
01:27:42 Speaker 1
There’ll be a few people who will, but I want people to understand like the limits of quantitative analysis, data analysis, more than anything else and it’s.
01:27:50 Speaker 2
You’re such a good teacher.
01:27:52 Speaker 2
That’s amazing.
01:27:52 Speaker 1
Try what’s.
01:27:53 Speaker 2
I hadn’t thought critically, as in analytically or carefully, about how the sausage is made in the sciences.
01:27:58 Speaker 2
Until I started my masters in the sociology of technology with students in the social sciences who had thought so much more about the.
01:28:04 Speaker 2
Sciences than I had as a scientist.
01:28:06 Speaker 1
No, but it’s like, well, I think.
01:28:07 Speaker 2
Also, like so, it’s great you’re giving that to your students.
01:28:09 Speaker 1
Yeah, but it’s hard too, because of what’s valued, you know. It’s like I got jobs because I teach stats.
01:28:16 Speaker 1
It’s like nobody wants to teach it, but everybody values it, and so it’s like, I can teach it, you know, but it doesn’t.
01:28:23 Speaker 1
But I grow more and more disillusioned with it because I’ll see times.
01:28:28 Speaker 1
There like.
01:28:29 Speaker 1
You know, like my dissertation, which I don’t want anyone to ever look at.
01:28:32 Speaker 1
Which now people will look at, but like it’s like it was.
01:28:35 Speaker 1
So like I started with a proposal and then I got a grant and then I was invested in it.
01:28:40 Speaker 1
By the time I realized that there wasn’t really any story to tell with the data that I had.
01:28:45 Speaker 1
And so I had to make a story.
01:28:46 Speaker 2
Cool.
01:28:47 Speaker 2
Yeah, yeah.
01:28:47 Speaker 1
You know, it’s like I had to do it.
01:28:48 Speaker 1
And so it’s like, well, let’s and there’s like, I didn’t do like horrible data mining.
01:28:52 Speaker 1
And I believe in ethical like I do believe that.
01:28:56 Speaker 1
It’s like I believe in letting the data tell the story while also recognizing there’s a person who collected these data.
01:29:02
Yeah.
01:29:02 Speaker 1
There’s or like realize it.
01:29:03 Speaker 2
Or there’s more than one way to.
01:29:04 Speaker 2
Tell this story.
01:29:05 Speaker 2
Right.
01:29:05 Speaker 2
And there’s more than one way to collect the data on any particular variable, which is what you.
01:29:07 Speaker 1
Yes, yes.
01:29:09 Speaker 1
Or that the variables aren’t there?
01:29:11 Speaker 1
It’s like I tried.
01:29:11 Speaker 1
Like I like.
01:29:12 Speaker 1
So I got into qualitative research because nobody was asking the questions that I was asking because nobody thought like breastfeeding and work.
01:29:13 Speaker 2
With this ring, yeah.
01:29:20 Speaker 1
What does that have to do with anything?
01:29:21 Speaker 1
And it’s like any ******* mother who had a baby like you were talking before.
01:29:24 Speaker 1
It’s like pumping your milk on the train it.
01:29:26 Speaker 1
Like, obviously there’s a huge connection between breastfeeding and work, but everything about breastfeeding was just like, oh, this is best.
01:29:32 Speaker 1
This is good for your baby, and I’m like, why is nobody thinking about what this is like for the person actually providing this, like, life-saving elixir of, like, world domination or whatever.
01:29:37
Yeah.
01:29:41
What?
01:29:43 Speaker 2
You know, it’s so interesting.
01:29:44 Speaker 2
Yeah, absolutely.
01:29:46 Speaker 2
I love that, yeah.
01:29:47 Speaker 2
So in the book, I’m quite careful, I think, to talk not about data bias, cause that stuff doesn’t concern me.
01:29:55 Speaker 1
Yeah. Yes.
01:29:55 Speaker 2
It’s partiality.
01:29:57 Speaker 2
It’s like, you know, on any given variable or curiosity about the world, someone decides.
01:30:03 Speaker 2
Well, first of all, what variables are of interest?
01:30:06 Speaker 2
And then and then how to collect those data?
01:30:09 Speaker 2
How to structure them?
01:30:10 Speaker 2
How to weight them?
01:30:11 Speaker 2
They write the algorithm so there’s human decision making there at every step of the way.
01:30:15 Speaker 2
And then there’s how to tell the story from the data, or from the insight.
01:30:17 Speaker 1
Yeah, yeah.
01:30:20 Speaker 2
Right.
01:30:21 Speaker 2
And so it’s.
01:30:22 Speaker 2
Yeah, just drawing attention to those kinds of things.
01:30:24 Speaker 2
And you’re right.
01:30:24 Speaker 2
That’s what social scientists, I think, do, especially of the qualitative ilk. I mean, it’s thinking carefully about fit for purpose too.
01:30:32 Speaker 2
Like one project that I’ve done with the Chief Scientist Office, with a grad student of mine, was designing a decision tool, really just an Excel spreadsheet, that would allow bureaucrats to assess the credibility of evidence coming in for impact assessments.
01:30:52 Speaker 2
These are like large environmental assessments of big development projects, but on social.
01:30:52 Speaker 1
Oh my God.
01:30:56 Speaker 2
Effects because there’s this lack of knowledge about well, first of all, that, like data collected from a public roundtable, are data.
01:31:04 Speaker 2
But also then how do we write?
01:31:06 Speaker 2
What method should we use to collect data on, say, perceptions of cultural artifacts that might be harmed by a development project?
01:31:12
OK.
01:31:13 Speaker 2
Could you qualitative wow.
01:31:13
Could.
01:31:15 Speaker 1
I it and I.
01:31:15 Speaker 2
Could you share with you that and?
01:31:16 Speaker 1
You could share it on the website.
01:31:17 Speaker 2
It’s like all sorts of, you know, just thinking really carefully.
01:31:20 Speaker 2
About, in many cases, it ought to be qualitative methods that are used, and then of course, there are ways to judge the rigor, or I would say the credibility, the validity, of data collected qualitatively.
01:31:33 Speaker 1
Yes.
01:31:33 Speaker 2
It’s not like it’s all opinion or nothing, right?
01:31:37 Speaker 2
You know it’s or.
01:31:38 Speaker 2
It’s all opinions, all relative. But there’s not a lot of knowledge about.
01:31:42 Speaker 2
How to make those assessments well.
01:31:45 Speaker 1
And it’s also, I love it.
01:31:46 Speaker 1
I was like, I wanna see it also for myself, cause there’s part of it where it’s like.
01:31:49 Speaker 1
I kind of intuitively know how to.
01:31:51 Speaker 1
But how do we do this like it’s?
01:31:51 Speaker 2
It’s like.
01:31:51 Speaker 2
You.
01:31:52 Speaker 2
And that was part of the fun.
01:31:53 Speaker 2
It was like trying to think through exactly systematically, how to assess.
01:31:57 Speaker 1
Yeah, yeah.
01:31:58 Speaker 1
How do I assess these claims?
01:31:58 Speaker 2
And rigor.
01:32:00 Speaker 1
You know, it’s like, how do I know?
01:32:02 Speaker 1
Cause, I’ll just say, like with my kids, it’s like they sent me this one video of this guy that they really like, a YouTuber, and they’re like, he collects so much evidence.
01:32:09 Speaker 1
He has so much data, all the stuff, and I’m like, yeah, but he’s making this causal claim.
01:32:15 Speaker 1
Like there could be 100 other causes.
01:32:17 Speaker 1
It’s like he’s saying well, like.
01:32:19 Speaker 1
Has event a happened?
01:32:21 Speaker 1
And yes, event B happened, but he’s saying B happened because of A and I’m like you can’t.
01:32:24 Speaker 2
I have.
01:32:26 Speaker 1
He doesn’t have, like, that’s the piece that he doesn’t have. And it’s just, I think those kinds of things are really, it’s like, why make the podcast about the social sciences and the humanities?
01:32:28
Yeah.
01:32:35 Speaker 1
Because I think it also is.
01:32:36 Speaker 1
It’s like the importance of like philosophy and you know, like media studies and rhetoric and all these different fields that I’m talking to people from of like.
01:32:46 Speaker 1
Like research is about thinking.
01:32:49 Speaker 1
It’s like it’s not just about like producing a report with a bunch of numbers on it.
01:32:53 Speaker 2
Yeah, you know.
01:32:54 Speaker 2
Yeah, evidence comes in a number of different ways.
01:32:58 Speaker 1
Yeah, yeah, I love it.
01:32:59 Speaker 1
I love it.
01:33:00 Speaker 1
Oh my God.
01:33:00 Speaker 1
And we got it and cause OK forever.
01:33:01
Like a time.
01:33:02 Speaker 1
Thank you, Phyllis.
01:33:03 Speaker 1
Thank you.
01:33:04 Speaker 2
I’ve had a boatload of fun, and it’s no clearer, the sky.
01:33:07 Speaker 1
I know no.
01:33:08 Speaker 2
I mean, oh.
01:33:08
Oh yeah.
01:33:09 Speaker 2
Our thinking maybe.
01:33:09 Speaker 1
Oh my.
01:33:10 Speaker 1
But.
01:33:10 Speaker 1
It’s like actually way worse.
01:33:12 Speaker 1
Out the window, it’s.
01:33:13 Speaker 2
Like Phyllis hasn’t turned around in this conversation.
01:33:14 Speaker 1
Sleep ohm.
01:33:15 Speaker 2
It’s a wall.
01:33:17 Speaker 2
My God. I just don’t know. I’m walking home.
01:33:18 Speaker 2
We’re gonna see.
01:33:19 Speaker 1
I know, I walked. Oh good.
01:33:21
Me too.
01:33:21 Speaker 1
OK.
01:33:22 Speaker 1
So that’ll be good.
01:33:23 Speaker 1
We’ll be able to see like and maybe it’s, I don’t know.
01:33:26 Speaker 1
I want some, like, metaphor of, like, sometimes it’s hard to see the big picture, but all we can do is take one step at a time.
01:33:32 Speaker 1
One day at.
01:33:32 Speaker 1
A time, yeah, one foot at a time.
01:33:34 Speaker 1
You know, we can only do it.
01:33:35 Speaker 1
We.
01:33:36 Speaker 1
And that’s right.
01:33:36 Speaker 1
That’s right.
01:33:37 Speaker 2
Alright, well, humility and taking one step at a time.
01:33:40 Speaker 1
Exactly.
01:33:40 Speaker 2
I’m gonna hold on to those.
01:33:41 Speaker 1
I love it.
01:33:41 Speaker 1
I.
01:33:42 Speaker 1
It.
01:33:42 Speaker 1
Oh my God, it’s so good. Thank
01:33:43 Speaker 1
you so much.
01:33:44 Speaker 1
This is so I’m so glad that you came here and.
01:33:47 Speaker 1
And yeah, if people want to know more about your work, where should I send them?
01:33:51 Speaker 2
I guess my website.
01:33:51 Speaker 1
Yeah, everybody’s is.
01:33:52 Speaker 2
It’s woefully out of date, but yeah.
01:33:55 Speaker 1
That’s OK.
01:33:56 Speaker 1
Is it Kelly Bronson?
01:33:58 Speaker 1
What is your website?
01:33:59 Speaker 2
You know what?
01:33:59 Speaker 2
It’s the Science and Society Collective.
01:34:03 Speaker 2
I made a distinct, or an on-purpose, um.
01:34:08 Speaker 2
What’s the word I’m looking for?
01:34:09 Speaker 2
Decision to not.
01:34:11 Speaker 2
Call it a lab.
01:34:13 Speaker 2
I don’t want to emulate the wet lab model, which is a whole other conversation.
01:34:16 Speaker 2
We could have.
01:34:16 Speaker 2
There’s a lot of pressure on me, when I got the Canada Research Chair, to apply for Canada Foundation for Innovation funding, infrastructure funding, and to have a lab, and most people call it a lab.
01:34:25 Speaker 2
And you know the Bronson lab and I didn’t want to do that.
01:34:29 Speaker 2
So.
01:34:29 Speaker 1
So you called it the collective.
01:34:29 Speaker 2
So we’re.
01:34:30 Speaker 1
Oh my God, I love that so much.
01:34:32 Speaker 1
Oh my God, I love it.
01:34:32 Speaker 2
It’s a bit weird, but that’s OK.
01:34:33 Speaker 1
But that’s what science should be like, to me.
01:34:36 Speaker 1
Science shouldn’t be a lab.
01:34:37 Speaker 1
It should be a collective.
01:34:38 Speaker 1
It’s a collective enterprise.
01:34:38 Speaker 2
And it is like I would be nothing without all the people around me.
01:34:39 Speaker 1
It’s about right.
01:34:41 Speaker 1
I love it.
01:34:42 Speaker 1
I love it.
01:34:42 Speaker 1
OK.
01:34:43 Speaker 1
Yeah, to me.
01:34:44 Speaker 1
Yeah, that’s not.
01:34:44 Speaker 1
Like communism, that is.
01:34:47 Speaker 1
That’s that’s good science anyway.
01:34:47 Speaker 2
It’s.
01:34:49 Speaker 1
Alright.
01:34:49 Speaker 2
And that that.
01:34:50 Speaker 2
Yeah, my approach to the Academy, but that’s OK.
01:34:51
Fair enough.
01:34:53 Speaker 1
Fair enough.
01:34:54 Speaker 1
It’s the best part of it.
01:34:55 Speaker 1
Anyway.
01:34:56 Speaker 1
Well, thank you, Kelly, and thank you to those of you listening to the Doing Social Research podcast.
01:35:02 Speaker 1
If you enjoyed this episode, please take a moment to give us a rating on your favorite podcast platform and share what you liked about it.
01:35:09 Speaker 1
And if you did not enjoy this episode, no one wants to hear from you.
01:35:13 Speaker 1
This will really help us to reach more listeners and make doing Social Research within the reach of everyone.
01:35:19 Speaker 1
I’d love to connect with you.
01:35:20 Speaker 1
You can follow me on, like, all the social media things, except, what’s the one I don’t use? Snapchat.
01:35:26 Speaker 1
I’m too old for that, but Instagram, Twitter, and I wanna get on Bluesky, which is like the new one.
01:35:32 Speaker 1
I hate X and I hate Elon Musk.
01:35:35 Speaker 1
I’m like, yeah.
01:35:36 Speaker 1
So anyway, I need to find you anyway.
01:35:37 Speaker 1
I’m at socio Mama.
01:35:39 Speaker 1
I also have a Facebook group, the Doing Social Research
01:35:43 Speaker 1
Facebook group, to keep the conversation going. At last count there were like 4 people who joined.
01:35:47 Speaker 1
So thank you to my three friends, because one of those is
01:35:50 Speaker 1
me.
01:35:52 Speaker 2
I will join you. Yep.
01:35:53 Speaker 1
Thank you.
01:35:54 Speaker 1
So links to all my social media and to references mentioned in today’s episode will be in the show notes.
01:36:00 Speaker 1
If you have a question about Social Research you’d like me to tackle on the podcast or make a post about on the website, you can send me an e-mail at [email protected] or message me through the social media.
01:36:12 Speaker 1
LinkedIn is another one.
01:36:14 Speaker 1
And don’t forget to check out the website doingsocialresearch.com.
01:36:17 Speaker 1
It’s still kind of a mess and a work in progress, but I am hoping to keep it tidied up, and I always want to add more information to it.
01:36:25 Speaker 1
Special thanks to our sound editor Willow Ruby Young for making us sound amazing.
01:36:29 Speaker 1
Jonathan Boyle is a real life person who I paid money to.
01:36:34 Speaker 1
I don’t know him at all, to write our theme song like a *** ***.
01:36:36 Speaker 1
And my favorite joke is to say.
01:36:38 Speaker 1
I know you can just like go online.
01:36:40 Speaker 1
It was like 100 bucks to get the rights to his song, and so, cause some people use like AI to make music.
01:36:45 Speaker 1
And I was like, I’m gonna pay a musician anyway.
01:36:47
OK.
01:36:47 Speaker 1
He was awesome.
01:36:48 Speaker 1
Like again, I have no idea who he is, but I love that the song is like a ******.
01:36:52 Speaker 1
Because I think, like we’re all ******** who are out there trying to do the Social Research in the face of like, you know, Monsanto or whatever.
01:36:59 Speaker 1
Anyway, I’m Phyllis Rippee, and this has been the Doing Social Research podcast.
01:37:03 Speaker 1
As always, remember, keep keeping it real and keep doing Social Research.
01:37:06 Speaker 1
Bye. Hi.
01:37:08 Speaker 1
And.
00:00:01 Speaker 1
Hello and welcome to the Doing Social Research podcast, where I talk with some of my favorite people who do Social Research to dig into the cool projects they’re working on.
00:00:10 Speaker 1
My goal is to help demystify research for students, inspire other researchers, and provide a platform for all the brilliant work of folks doing research in the humanities and social sciences.
00:00:20 Speaker 1
I’m your host, Phyllis Rippee, a professor of sociology at the University of Ottawa and creator of the website doingsocialresearch.com.
00:00:27 Speaker 1
But today we are not here to talk about me.
00:00:29 Speaker 1
We’re here to talk with the brilliant and amazing Dr. Kelly Bronson.
00:00:33 Speaker 1
Doctor Bronson is an associate professor at the School of Sociological and Anthropological Studies at the University of Ottawa, where she also holds the prestigious position of Canada Research Chair (Tier 2) in Science and Society.
00:00:43 Speaker 1
As a social scientist, Doctor Bronson focuses on the complex interactions between science, technology, and society, particularly around controversial technologies such as GMOs, fracking, big data, and AI, which, I put this in the notes like 100 times.
00:00:57 Speaker 1
But I’m especially excited about the AI stuff because I keep asking everyone about it.
00:01:00 Speaker 1
You’re the first person who’s actually an expert on it, but I’m so very excited about that.
00:01:05 Speaker 1
Her research aims to bridge the gap between technical knowledge and community values, fostering evidence based decision making that incorporates diverse perspectives.
00:01:13 Speaker 1
Doctor Bronson has published extensively in regional, national and international journals, and her work has significantly contributed to understanding the sociopolitical dimensions of emergent technologies.
00:01:22 Speaker 1
Before joining the University of Ottawa, Doctor Bronson directed the Science and Technology Studies program at St. Thomas University in New Brunswick.
00:01:29 Speaker 1
She holds a PhD in communication and cultural studies from York University and a master’s degree in science and technology from the University of Saskatchewan.
00:01:36 Speaker 1
Additionally, she has a background in biology, having worked as a genetics and plant biology lab scientist at Queen’s University.
00:01:43 Speaker 1
Doctor Bronson has been actively involved in various advisory roles on large research grants and sits on the editorial boards of several academic journals.
00:01:50 Speaker 1
She also leads the Canadian Network for Science and Democracy, collaborating with international counterparts to promote responsible innovation. And one of my favorite things about Kelly is that despite all of her brilliance and all of this fancy stuff, she’s just amazing.
00:02:04 Speaker 1
She’s talked about the realities of combining this kind of fancy
00:02:07 Speaker 1
Work with the realities of being a normal human being.
00:02:10 Speaker 1
Faced with the kinds of neuroses, fears, stresses, and strains of life in the modern world that we all have to deal with. And so, to borrow from RuPaul, you give not just professorial realness, but also, like, really real realness.
00:02:23 Speaker 1
So anyway, I’m very excited to dig into your work and to chat about it, especially the AI stuff, as I said.
00:02:29 Speaker 1
But yeah, just, welcome to my podcast.
00:02:32 Speaker 2
Thank you.
00:02:33 Speaker 2
Thanks, Phyllis.
00:02:34 Speaker 2
I’m so happy to be here and I feel like we should.
00:02:38 Speaker 2
Take a quick sidebar and just set the stage a little bit, because this is very Ottawa.
00:02:43 Speaker 2
We’re both here.
00:02:44 Speaker 1
Oh my God.
00:02:45 Speaker 2
Yeah, like physically present.
00:02:47 Speaker 2
And you know, for listeners, they can’t see behind you.
00:02:50 Speaker 2
But we’re looking at a wall of snow.
00:02:52 Speaker 1
Oh my God, it is.
00:02:52 Speaker 2
There’s no visibility.
00:02:53 Speaker 2
There’s such a huge epic snowstorm or blizzard happening outside your office window.
00:02:58 Speaker 2
Outside your place of work. It looks like we may be trapped here.
00:02:58 Speaker 1
I.
00:03:00 Speaker 1
5 almost.
00:03:01 Speaker 2
We may be stuck here talking about AI and agriculture for days.
00:03:04 Speaker 1
I know.
00:03:05 Speaker 1
I feel like I hope.
00:03:06 Speaker 1
I don’t know what what?
00:03:07 Speaker 1
Maybe it’s like our job.
00:03:08 Speaker 1
Is to like clear the fog.
00:03:10 Speaker 2
And that’s maybe, maybe not just metaphorically. As we talk, the sky will part. And also, we both accidentally wore hot.
00:03:10 Speaker 1
Yes.
00:03:11 Speaker 1
You know.
00:03:13 Speaker 1
Right, exactly.
00:03:16
Please.
00:03:17 Speaker 1
Yes.
00:03:19 Speaker 1
Oh my God, I love it.
00:03:21 Speaker 2
Pink. Hot pink.
00:03:22 Speaker 2
Same colour.
00:03:23 Speaker 2
So funny, it’s not really seasonal.
00:03:24 Speaker 1
I know I love it.
00:03:26 Speaker 2
We were just 20s.
00:03:27 Speaker 1
I.
00:03:27 Speaker 2
Know and also RuPaul. Can I just say so.
00:03:30 Speaker 2
We’re not going to talk about.
00:03:31 Speaker 2
Today.
00:03:31 Speaker 2
But um, I in my off hours.
00:03:34 Speaker 2
I now teach a fitness class.
00:03:36 Speaker 2
You know, way Pilates.
00:03:36
Did.
00:03:37
Yeah.
00:03:38 Speaker 1
Oh my God.
00:03:38 Speaker 2
At a gym called Where I Thrive.
00:03:39 Speaker 1
Where I want to take it.
00:03:40 Speaker 2
You should. Oh my God.
00:03:41
Yeah.
00:03:41 Speaker 1
I’ve been there cause milenko’s.
00:03:42 Speaker 2
OK.
00:03:43 Speaker 1
And I got like a free one month.
00:03:43 Speaker 2
You say does for there.
00:03:45 Speaker 2
Well, you could be my guest.
00:03:46 Speaker 1
Can’t speak. Oh my God.
00:03:47 Speaker 2
Um, but I have a new playlist every month. Yes, new month, new playlist. December.
00:03:51 Speaker 1
OK, love it.
00:03:53 Speaker 2
I had a playlist, started my playlist.
00:03:55 Speaker 2
When was this on Sunday? Monday.
00:03:56 Speaker 1
OK, OK.
00:03:58 Speaker 2
I hadn’t quite listened through all of the songs.
00:04:00 Speaker 2
Yep, put a RuPaul song on there.
00:04:02 Speaker 2
Hmm, we’re halfway through this like core exercise, and I realize this song is so spicy.
00:04:10 Speaker 2
Like it has some real inappropriate language.
00:04:15 Speaker 2
Yeah.
00:04:16 Speaker 2
So I had to hop up and you know, I don’t want to censor RuPaul, but.
00:04:20 Speaker 2
Just just.
00:04:21 Speaker 1
That that is.
00:04:21 Speaker 2
That’s too spicy for 6:45 on a Monday morning.
00:04:25 Speaker 1
That is so funny.
00:04:26 Speaker 1
I often have.
00:04:27 Speaker 1
When I’m teaching, I usually play a song, especially when I’m teaching like stats or something.
00:04:33 Speaker 2
I’ve got it open.
00:04:33 Speaker 1
It’s kind of boring, so I try to.
00:04:34 Speaker 1
Like you know, with a song.
00:04:36 Speaker 1
Yes.
00:04:37 Speaker 1
Yeah.
00:04:37 Speaker 2
Does it have meaning? Like, was it a girl anthem post-election, or?
00:04:37 Speaker 1
Meaning? Well, sometimes it’ll be more like super, like, dorky.
00:04:46 Speaker 1
Like dad joke.
00:04:47 Speaker 1
Stuff like, when we’re talking about like measures of central tendency, I’ll play the song “The Middle.”
00:04:48
OK.
00:04:52 Speaker 1
That’s like, why can’t you?
00:04:54 Speaker 1
Meet me in the middle.
00:04:55 Speaker 1
I really ****** that up right now.
00:04:57 Speaker 1
Sorry, I’m also trying not to swear as much on this podcast, and I cannot stop, because I want the university to, like, promote it.
00:04:58
That’s great.
00:05:04 Speaker 1
But eventually, once I have enough episodes. But I’m like, they won’t listen to it, they’ll just be like, oh, professor.
00:05:09 Speaker 1
And then.
00:05:10 Speaker 2
Just beep it out.
00:05:10 Speaker 1
I know, I did do that.
00:05:13 Speaker 1
On my episode with Ivy, cause she is so professional, she doesn’t like swear, and so I beeped out some of my swears, cause there weren’t.
00:05:22 Speaker 1
I was like on better behavior.
00:05:22 Speaker 2
In contrast, didn’t deserve it this.
00:05:24 Speaker 1
Yeah, right.
00:05:25 Speaker 1
Other times I just say explicit. Anyway. But before my class, and so I tried.
00:05:27
OK.
00:05:31 Speaker 1
There’s certain songs where I’m like, Oh my God, I want to play that.
00:05:33 Speaker 1
But like there’s certain, like, I’m not going to play it if it has the N word. Like, I’m not judging, I’m not gonna tell Black people what to do, but as a white woman, I’m not gonna be doing that.
00:05:37 Speaker 2
And Oh yeah.
00:05:40 Speaker 2
Bitpop culture.
00:05:41 Speaker 2
Yeah, no, not appropriate.
00:05:44 Speaker 1
Or songs that could be triggering, you know, like I’m trying to be thoughtful.
00:05:46 Speaker 2
Yeah, yeah.
00:05:47 Speaker 1
But it is like it’s hard, so I’m always asking people like students in the class or my kids.
00:05:52 Speaker 1
I’m like, will I get cancelled?
00:05:54 Speaker 1
Who is this? Cause.
00:05:54 Speaker 1
Also my songbook is from like 1995 and so I’m always like hey could I play?
00:06:00 Speaker 1
Kendrick Lamar was one recently and then I was like, no.
00:06:03 Speaker 1
No, I cannot.
00:06:04 Speaker 1
I was like, I think he’s awesome.
00:06:06 Speaker 1
Yeah.
00:06:06 Speaker 1
And I think what he does is awesome, but no, no, I will not be able to play any of those songs.
00:06:12 Speaker 1
So anyway, but it’s hard, because it’s like you want to set the vibe, and it’s.
00:06:12
Yeah, it’s tricky.
00:06:18 Speaker 1
Anyway, that’s the whole thing.
00:06:18 Speaker 2
Yeah, I think the F word is fine, but, you know, derogatory terms like the C word or the B word, totally, and obviously the N word.
00:06:22
Hmm.
00:06:26 Speaker 1
Yeah.
00:06:26 Speaker 2
Anyway, sorry, that was a bit of a sidebar.
00:06:27 Speaker 1
No, I’m just.
00:06:28 Speaker 1
No, I love it.
00:06:29 Speaker 2
Thank you for that introduction.
00:06:29 Speaker 1
I love it.
00:06:31 Speaker 2
That was very kind, and I love that you said prestigious Research
00:06:35 Speaker 2
Chair. It is prestigious.
00:06:36 Speaker 1
It was.
00:06:37 Speaker 1
I’ll just admit that there was some AI involved in the writing of it, but I read it.
00:06:42 Speaker 1
And I love telling this story.
00:06:44 Speaker 1
And I think again, we’ll get to my first question, but.
00:06:47 Speaker 1
When I was an undergrad, well, I get very angry about cheating and plagiarism when students do it, and I think it’s because, like, I always was kind of a do-gooder. Like, I wasn’t an A+ student by any stretch, but I would never cheat, and so it really made me mad.
00:07:02 Speaker 1
And I remember that my now ex-husband we went to undergrad together.
00:07:07 Speaker 1
We.
00:07:07 Speaker 1
And he had a friend who paid for a paper.
00:07:10 Speaker 1
So this was in the 90s before the Internet.
00:07:11 Speaker 2
Oh wow.
00:07:12 Speaker 2
Those early days of that kind of falsification, yeah.
00:07:12 Speaker 1
She right.
00:07:13 Speaker 1
She right.
00:07:15 Speaker 1
And she paid something like 700.
00:07:17 Speaker 2
Whoa.
00:07:17 Speaker 1
Dollars or five?
00:07:18 Speaker 1
I can’t remember.
00:07:19 Speaker 2
So that’s wrong on so many levels, just the.
00:07:19 Speaker 1
I remember right.
00:07:20 Speaker 1
It felt like a lot, and she got a C on it, and she was really mad cause she paid so much money for it, and I think, I know, well.
00:07:24
Oh wow.
00:07:25 Speaker 2
Ha ha.
00:07:27 Speaker 2
Ah.
00:07:30 Speaker 1
But then I paid good money for this.
00:07:31 Speaker 1
Exactly.
00:07:32 Speaker 1
And my ex-husband’s like, but you didn’t write it.
00:07:32
No.
00:07:33 Speaker 1
She’s like, well, I read it.
00:07:34 Speaker 1
And I agreed with it.
00:07:35 Speaker 1
And she literally took it all the way, and she brought it to the professor.
00:07:38 Speaker 1
They didn’t.
00:07:39 Speaker 1
Raise it enough, so she brought it to the department, right, and she, like, brought it to like the highest levels and eventually got it raised to like a B plus or something.
00:07:45 Speaker 1
And I’m like, I know, right?
00:07:46 Speaker 2
That’s hilarious.
00:07:47 Speaker 2
Of course you care about cheating, not just because I mean most of us who end up where we are.
00:07:52 Speaker 2
We’re probably sticker children who were externally motivated on some level and, you know, got a lot from working hard and doing well.
00:07:54 Speaker 1
Next 8 ohm totally.
00:07:58 Speaker 2
In school so.
00:07:59 Speaker 1
Yes.
00:08:00 Speaker 2
So you care about justice.
00:08:01 Speaker 1
Exactly.
00:08:01 Speaker 2
A factor in doing Social Research, right?
00:08:02 Speaker 1
Exactly like.
00:08:03 Speaker 2
Whenever my kids are.
00:08:04
That’s not fair.
00:08:06 Speaker 2
This is good.
00:08:06 Speaker 2
This shows that you care instead.
00:08:06 Speaker 1
Yeah, I know.
00:08:08 Speaker 1
I know, although I’ve come to as a mother, I’ve come to be like.
00:08:11 Speaker 1
Yeah, life’s not fair.
00:08:12 Speaker 1
Get used to it.
00:08:13 Speaker 2
And that goes, too, for the distinctions between equity and equality, you know, or equality and equity.
00:08:13 Speaker 1
I am the worst.
00:08:19 Speaker 1
Totally.
00:08:19 Speaker 2
Fairness, right?
00:08:20 Speaker 2
Doesn’t mean everyone gets the same.
00:08:22 Speaker 1
Exactly.
00:08:22 Speaker 2
That’s hard for you.
00:08:23 Speaker 2
You have multiple children, right?
00:08:24 Speaker 2
And I find this hard with my two kids, who are very different in age and ability.
00:08:24 Speaker 1
So many, yeah, yeah.
00:08:28 Speaker 2
And um, it’s tricky.
00:08:30 Speaker 1
It really is.
00:08:31 Speaker 2
Of course, you’re not gonna get the same.
00:08:32 Speaker 1
Yeah, everybody has different needs, right?
00:08:33 Speaker 2
That’s more than you. Oscar’s 13.
00:08:35 Speaker 1
Yeah, totally.
00:08:37 Speaker 2
Totally, you can do more and ought to do more. Anyway.
00:08:41 Speaker 1
Anyway, so let’s.
00:08:42 Speaker 1
But this is all this is all relevant.
00:08:44 Speaker 1
This is all awesome.
00:08:44 Speaker 1
So I always start with my titular question, which is a word that makes me giggle.
00:08:49 Speaker 2
Yeah, anywhere Speaking of spicy.
00:08:50 Speaker 1
Anyway, my titular question.
00:08:54 Speaker 1
And you.
00:08:54 Speaker 1
I’m tryna, I wanna become, like, super mature.
00:08:57 Speaker 1
You know I’m.
00:08:58 Speaker 1
I’m so mature and I’m, you know, I’m trying to be the, you know, the Auntie Joe Rogan.
00:09:04 Speaker 1
Like first, prov.
00:09:05 Speaker 1
Super influencer, so I gotta build my brand.
00:09:06 Speaker 2
Love it.
00:09:08 Speaker 1
This so being doing Social Research.
00:09:08 Speaker 2
OK.
00:09:10 Speaker 1
So tell me, what research are you doing these days?
00:09:14 Speaker 2
OK these days.
00:09:16 Speaker 1
Or anytime, like, really. I do it just for branding, but I don’t, I wanna talk about whatever research.
00:09:21 Speaker 2
You wanna talk about?
00:09:22 Speaker 2
So I am I call myself now a reluctant AI scholar.
00:09:27 Speaker 1
OK, I love it.
00:09:28
Yes.
00:09:29 Speaker 2
Um, everyone, of course, is an AI scholar.
00:09:32 Speaker 2
Yeah, but and as you know, I have this book.
00:09:35 Speaker 2
Which we will probably talk about, and.
00:09:37 Speaker 1
Oh my God, I didn’t mention your book in the thing.
00:09:39 Speaker 1
Which now I’m mad at myself about, but also I have it sitting right in front of me.
00:09:42 Speaker 1
I just want to pause for a second because also I talked about this with somebody else on another episode.
00:09:48 Speaker 1
is book covers, and I know you and I had a lot of, like, stress and strain, but I actually think your book cover is really cool.
00:09:50 Speaker 2
I did just.
00:09:52 Speaker 2
That was a lot of strain.
00:09:54 Speaker 1
Anyway, the book is called and it has one of the best titles.
00:09:57 Speaker 1
I just love it.
00:09:59 Speaker 1
The Immaculate Conception of Data: Agribusiness, Activists, and Their Shared Politics of the Future.
00:10:05 Speaker 1
This book came out in 2021-2022 with McGill-Queen’s University Press, and it’s an awesome book, and I want to.
00:10:10 Speaker 2
That’s forever, yeah.
00:10:14 Speaker 1
I want to get into it, so I I know we did.
00:10:15 Speaker 2
We shared a press, we shared another, um, my friend just did the collage on the cover.
00:10:17 Speaker 1
We had a shared editor and it’s very cool.
00:10:21 Speaker 1
No way.
00:10:22 Speaker 2
Yeah.
00:10:22 Speaker 1
I did not.
00:10:22 Speaker 2
She’s brilliant.
00:10:23 Speaker 1
Farts.
00:10:23 Speaker 2
Artist.
00:10:24 Speaker 2
Yep.
00:10:24 Speaker 1
I love it.
00:10:25 Speaker 2
And so that was nice.
00:10:26 Speaker 1
It’s.
00:10:27 Speaker 1
Yeah, I hate my book cover.
00:10:28 Speaker 2
But before you do, I don’t hate it.
00:10:28 Speaker 1
Yours. I just think it like captures.
00:10:31 Speaker 1
It’s just if you look at a thumbnail, you can’t see the title.
00:10:34 Speaker 1
It’s like it’s like it’s too there’s too much happening on it, but anyway.
00:10:35 Speaker 2
Yeah.
00:10:40 Speaker 1
Anyway, so let’s talk about the research you’re doing lately and about your book, but also stuff more.
00:10:43 Speaker 2
Yeah.
00:10:45 Speaker 1
All it’s it’s all related.
00:10:45 Speaker 2
OK, what am I doing? So I continue to work on this thing called AI. I’m nervous about claiming expertise.
00:10:58 Speaker 2
You said I’m so excited to have an expert on AI here.
00:11:02 Speaker 2
Only because I would say that I am perhaps an expert, or at least I research with some kind of depth.
00:11:11 Speaker 2
the social dimensions of artificial intelligence, the kind of social and, more so, social justice impacts, which I’m always keen to distinguish from, say, the ethics of AI, which is a big part of the kind of social conversation, meaning the conversation that ends up being critical. Critical as in analytical, right, not necessarily
00:11:14 Speaker 1
Yeah.
00:11:17
Hmm.
00:11:23
Oh.
00:11:24 Speaker 1
Interesting.
00:11:32 Speaker 2
negative... gets, I think, hamstrung or
00:11:35 Speaker 2
circumscribed around, well, to be honest, law. Like, yes.
00:11:40 Speaker 2
Yeah.
00:11:41 Speaker 2
So anyway, that’s a whole other thing we could talk about that, yeah.
00:11:42 Speaker 1
No, no, no, not anyway.
00:11:43 Speaker 1
No, no.
00:11:44 Speaker 1
But just for a sec, ’cause I think that’s also important for distinguishing between Social Research and, like, other kinds of research. But also, like, I think that’s really... I never really thought about it that way.
00:11:55 Speaker 1
But like, just tease that out a little bit more for us about like the difference between ethics and then like the social implications of something like, you know, cause there is like this.
00:12:04 Speaker 2
Right.
00:12:05 Speaker 1
Like there’s morals, there’s ethics.
00:12:07 Speaker 1
And then there’s also just, like, the facts of what consequences there are for different social groups, and, like, so.
00:12:13 Speaker 2
Yeah, exactly.
00:12:13 Speaker 1
Tease that out, yes.
00:12:14 Speaker 2
And it’s the latter that I’m more just like you, right?
00:12:16 Speaker 2
That’s what I care about.
00:12:19 Speaker 2
Is really power fundamentally.
00:12:22 Speaker 2
Um, yeah, there are different ways I guess, to tease that out, that distinction.
00:12:26 Speaker 2
But in a really practical way.
00:12:28 Speaker 2
So this was years ago when I was working on the book.
00:12:31 Speaker 2
And it’s interesting, I think, that the book has data in the title, because when I started the book in 2016, everyone, not just social researchers or those doing Social Research but, right, members of the public, were talking about data. And everyone... it was, like, around, just before, I suppose, but 2017, 2018, there was increasing awareness around the
00:12:35 Speaker 1
I know, yeah.
00:12:52 Speaker 2
uses and misuses of our personal data, mostly collected from online environments.
00:12:57 Speaker 2
Right.
00:12:57 Speaker 2
Cambridge Analytica scandal and the whole, you know, gaming of the American election.
00:12:59
Yes.
00:13:02 Speaker 2
Electoral politics more broadly.
00:13:03 Speaker 2
Brexit, you know. And we had, like, Facebook testifying before Congress, Colin Stretch and that really weak testimony before US
00:13:04 Speaker 1
Facebook.
00:13:13 Speaker 2
Congress, and then of course we had Mark Zuckerberg. So there was just... people were sort of aware. And then I think even Elon Musk, which I think in
00:15:22 Speaker 2
retrospect is quite funny, but you know, he
00:13:24 Speaker 1
What did they say?
00:13:25 Speaker 1
I don’t remember.
00:13:25 Speaker 2
Well, he... I’m pretty sure that he started this campaign online.
00:13:29 Speaker 2
Hashtag quit Facebook.
00:13:31 Speaker 2
And it was like, we’re all going to.
00:13:32 Speaker 1
What?
00:13:34 Speaker 2
I just remember my students really being concerned and curious.
00:13:38 Speaker 2
And you know, I had this book that’s MIT press, this tiny little black book called obfuscation.
00:13:44 Speaker 2
And people were, like, actively trying to, right, find different ways of behaving online to make themselves less visible or less amenable to the kind of harvesting, which is what Zuboff calls
00:13:58 Speaker 2
it. Shoshana Zuboff, sorry. Of personal data.
00:14:02 Speaker 2
So anyway, everyone was talking about data, right?
00:14:04 Speaker 2
We were having like, a real moment of awareness and and public visibility.
00:14:09 Speaker 2
And so my whole book is about data, yeah.
00:14:12 Speaker 2
But really it could be it could be AI.
00:14:14 Speaker 2
Now everyone talks about AI.
00:14:16 Speaker 2
Yeah.
00:14:17 Speaker 2
And I actually think, and I asked an historian of AI, Luke Stark, about this two years ago.
00:14:24 Speaker 2
And he said yes, I think I agree.
00:14:25 Speaker 2
But I don’t know if anyone has actually charted this, I actually think.
00:14:29 Speaker 2
The industry has sort of led social researchers, even critical social researchers, away from the data conversation, right?
00:14:36 Speaker 1
That’s so interesting.
00:14:36 Speaker 2
So now there’s, like, critical AI studies.
00:14:38 Speaker 2
But you know the the journal where people publish is still big data and society.
00:14:43 Speaker 2
Basically, everyone stopped talking about data and started talking about AI, and I actually think maybe some of that was.
00:14:50 Speaker 2
a tactical switch on the part of industry, right? It’s like, everyone was aware of data, and so then people started... industries...
00:14:56 Speaker 1
Like big Data is working.
00:14:57 Speaker 2
Yeah, industry started leading us toward conversations around AI and then leading the critical conversation, right?
00:15:02 Speaker 2
Those, like, open letters around AI and the concern... actually, this comes back to, this is a super circuitous answer to your question, but it comes back to your question about what’s the distinction, social justice versus the other kinds of maybe philosophical questions.
00:15:10 Speaker 1
No, no, no.
00:15:10 Speaker 1
This is awesome.
00:15:16 Speaker 2
I mean, I actually think in this move, and again I have not looked at this systematically.
00:15:16
Yeah.
00:15:22 Speaker 2
Um, but this move toward talking about AI as opposed to data?
00:15:27 Speaker 2
I suspect it’s sort of been led by industry and you have these open letters, right, that industry folks like Sam Altman have written on AI and some of the potential negative consequences.
00:15:39 Speaker 2
And I think if one were to look at it systematically, the consequences are always those big kind of philosophical or maybe moral questions, right?
00:15:47 Speaker 2
Like, what’s the distinction between human reasoning and computer reasoning, and is this the end of human reasoning?
00:15:53 Speaker 2
And are robots going to turn on us?
00:15:55 Speaker 1
Yeah, yes, yes.
00:15:55 Speaker 2
And like the existential right?
00:15:58 Speaker 2
And then the conversation is always like, oh, but don’t worry, we got this right, like in those open letters.
00:16:02 Speaker 1
Right. Right.
00:16:03 Speaker 2
It’s always like big existential threat, and we know the power of these tools and we know them best and so leave it to us right.
00:16:09 Speaker 1
Yes.
00:16:11 Speaker 2
It kind of really conditions that governance space around AI. But I think, all this to say:
00:16:18 Speaker 2
It’s weird that my book is about data because it really could be.
00:16:21 Speaker 2
It is a book about AI and we can come back to talking about that in a second.
00:16:22
Hold on.
00:16:25 Speaker 2
But back to the conversation about or the question about social justice versus the kind of ethics.
00:16:32 Speaker 2
I think that’s a part of it that I’m always trying to say.
00:16:35 Speaker 2
Yes, there are those really important questions that probably philosophers mostly should puzzle through.
00:16:42
Yeah.
00:16:43 Speaker 2
maybe in concert with the technologists, right, about, like, the distinctions between humans and, you know... and, like, how do we create
00:16:51 Speaker 2
A good as in functioning, but also good for a large number of people.
00:16:55 Speaker 2
human-machine compromise, and how do we set limits around computer reasoning such that, you know, we don’t have the I, Robot scenario. That’s a reference to that late-90s film with Will Smith, who’s now, I think, cancelled, but, um.
00:17:09 Speaker 1
He’s always he’s cancelled his necklace so that I never.
00:17:10 Speaker 2
But if you cancel it, I don’t know.
00:17:12 Speaker 1
Never.
00:17:13 Speaker 2
I mean, he did.
00:17:14 Speaker 2
Perform an act of physical violence in public.
00:17:16 Speaker 2
Against him... was it a woman?
00:17:18 Speaker 2
No, no it wasn’t.
00:17:18 Speaker 1
It was a guy, and it was in defence of his wife.
00:17:19 Speaker 2
It was in defence of a woman, true.
00:17:21 Speaker 1
It’s a tricky one.
00:17:22 Speaker 2
It is a tricky one, I know.
00:17:23 Speaker 2
Anyway, sorry but, but like yeah so so.
00:17:27 Speaker 2
So.
00:17:28 Speaker 2
You know, I think it’s a very practical.
00:17:29 Speaker 2
It’s like who gets to enter into these conversations around?
00:17:33 Speaker 2
What are the risks and how do we define the risks and those two are related, right?
00:17:36 Speaker 2
How we define them then matters for who gets involved in the conversation about how to best mitigate them. And if we define the risks as only these big moral, existential ones, right, and not, like, who’s made more powerful by these technologies, who’s disempowered, right, and do these tools fundamentally reproduce harm to particular, you know, historically made-marginal social groups...
00:17:59 Speaker 1
Yes.
00:18:04 Speaker 2
These, the kind of power questions and the justice questions.
00:18:07 Speaker 2
Those are the the things that I work on and that I’m fundamentally concerned about.
00:18:11 Speaker 2
But yeah, I’m both a reluctant AI scholar,
00:18:15 Speaker 2
and then I’m also, partly because of my Canada Research Chair title
00:18:17 Speaker 2
and because we happen to be in Ottawa, continuously pulled into the governance space, which is a great space to be in, and there’s lots happening and it’s all important.
00:18:26 Speaker 2
But I’m often the only social scientist or person doing Social Research in the room with a bunch of lawyers.
00:18:33 Speaker 2
Love lawyers?
00:18:33 Speaker 2
Really important, right?
00:18:35 Speaker 2
But you know, the conversation is very much like either around how do we set the bar or how do we compare the outcomes from this particular AI to.
00:18:47 Speaker 2
the law, which is basically setting the bar around legal compliance, right?
00:18:49 Speaker 1
Yeah.
00:18:52 Speaker 2
Right.
00:18:53 Speaker 2
Or, maybe some of the conversation now has, like, stretched to, how do we compare the uses and potential future uses of this tool against things like the Equity Act?
00:19:05 Speaker 2
Which would be the best case, but mostly it’s human rights, and that’s really important too.
00:19:10 Speaker 2
But I’m always like, let’s just let’s think a bit broader, right?
00:19:13 Speaker 2
Beyond legal compliance and beyond.
00:19:16 Speaker 2
Historic legislation because legal tools are inherently conservative to think about what are the best outcomes for the most number of people, and specifically for people who are historically or currently made marginal.
00:19:27 Speaker 1
Oh my God, I love this so much.
00:19:29 Speaker 1
And this is, like... I have so many thoughts. And this is, like, why I want everybody to listen to every episode of my podcast, not just so I can build my brand, but because it is so fascinating to hear you at this point, after talking to so many people with different attitudes, different
00:19:44 Speaker 1
approaches, different feelings, and just, like, various thoughts on it. And, like, one thing that has come up in multiple episodes is my love of Hannah Arendt and the book The Human Condition, and, oh my God.
00:19:55
Oh, such a good
00:19:56 Speaker 2
book! But in fact the preface is almost my favorite part of that book, where she
00:20:00 Speaker 1
Sputnik! Exactly. This is, like... I was just.
00:20:00 Speaker 2
Talks about Sputnik.
00:20:02 Speaker 1
I was like Willow brought it.
00:20:03 Speaker 1
Willow Scobie, in another episode, brought it up, I think, really.
00:20:06 Speaker 1
Yes, yes.
00:20:07 Speaker 2
That’s interesting.
00:20:07 Speaker 1
And I brought it up with.
00:20:08 Speaker 2
Of course we’re.
00:20:09 Speaker 1
All right, but it is, like, her whole point. And this is what I just keep saying, is her question.
00:20:09 Speaker 2
My colleague.
00:20:13 Speaker 1
The whole thing about Sputnik. It’s like, it’s a simple, modest question and then a very thick, very dense, long book.
00:20:19 Speaker 1
I just want us to think, what are we doing?
00:20:22 Speaker 1
And it’s like and you can see when she’s talking about the ways exactly.
00:20:26 Speaker 2
Why?
00:20:27 Speaker 1
It’s like, why are we doing this?
00:20:29 Speaker 1
Like, what brought us here? And, also super interesting: like, I just started to listen to... ’cause I keep getting into these arguments with my kids about, like, what was the cause of the most recent US election. And so I decided to start listening to the audiobook of The Origins of Totalitarianism by Arendt, and.
00:20:49 Speaker 1
And then I was like, oh my God, this is so good.
00:20:53 Speaker 1
But then, because I’m a nerd mom, I made my kid read to me.
00:20:56 Speaker 1
I was like they were.
00:20:57 Speaker 1
He wanted to come over for dinner and I was like, OK, but I’m like, doing stuff.
00:21:02 Speaker 1
So here, take your computer and find me a critique of this, before I start telling you that this is the right thing.
00:21:07 Speaker 1
Find me a critique.
00:21:08 Speaker 1
He’s a sociology major in third year right now.
00:21:11 Speaker 1
Yeah, it’s like, so fun.
00:21:12 Speaker 2
I love that you’ve created a critical conversation for him.
00:21:15 Speaker 2
I know parenting.
00:21:15 Speaker 1
And it’s like, that’s part of what frustrates me in our debates about the election is.
00:21:21 Speaker 1
A kind of certainty that they have with their ideas, and I have often been accused by various husbands and children that it’s like you just think what you think and you’re like, you’re not critical of yourself.
00:21:33 Speaker 1
You just like, aren’t listening.
00:21:34 Speaker 1
to other ideas. And also, like, people who will, you know, accuse me of, like, wokeness.
00:21:40 Speaker 1
And I’m like, oh, my God, my kids call me.
00:21:42 Speaker 1
I’m, like, just a dumb lib, that I’m, like, not woke enough.
00:21:45 Speaker 2
Term now, yeah.
00:21:45 Speaker 1
Hmm I.
00:21:46 Speaker 1
Yes, that it’s like cause I’m not left enough for them, right.
00:21:48 Speaker 2
Liberalism offensive, OK.
00:21:50 Speaker 1
So it’s I’m too like moderate basically.
00:21:54 Speaker 1
And so I’m like, look, I’m like to me, social science, like any science, is about trying to falsify our claims.
00:22:01 Speaker 1
To me it is always about trying to disprove. Anyway, so I’m like, Jack, find me a critique. And pretty quickly I was like:
00:22:06 Speaker 1
This is dumb, and I’ll tell you why, but keep reading. Anyway, so I’m like, I want good evidence. But what was really helpful is... the guy who wrote the article...
00:22:15 Speaker 1
I shouldn’t say he was dumb.
00:22:16 Speaker 1
It was.
00:22:17 Speaker 1
There were certain things that I disagreed with, but I wasn’t looking at it.
00:22:19 Speaker 1
I was, like, listening and cooking dinner, so my apologies to this person.
00:22:22 Speaker 1
I probably should put him in the show
00:22:24 Speaker 1
notes. He’s an expert on Hannah Arendt.
00:22:26 Speaker 1
But one thing that he said that I think makes sense given what I’ve read about the human condition, is that what is?
00:22:32 Speaker 1
What explains totalitarianism?
00:22:36 Speaker 1
One of the things that allows totalitarian leaders to take form to come to power is their ability to transcend context.
00:22:45 Speaker 1
And one of the things that I love about the human condition is, and her critique of science and technology is this idea that we as humans are constantly trying to transcend the human condition that we are trying to escape.
00:22:57 Speaker 1
That’s why Sputnik matters. Like, we want to go to outer space
00:23:00 Speaker 1
So we can avoid being human.
00:23:01 Speaker 1
And she’s like.
00:23:01 Speaker 1
I’m sorry friends, we can’t and.
00:23:03 Speaker 2
Now you’ve got to stay with the trouble.
00:23:05 Speaker 2
As Donna Haraway would say.
00:23:06 Speaker 1
Exactly.
00:23:07 Speaker 1
And So what I just.
00:23:09 Speaker 1
And it was like, holy ****.
00:23:10 Speaker 1
And I think that what you’re describing is like with the whole big data and this shift to AI.
00:23:17 Speaker 1
It’s like, what people in big tech, in these firms, are doing is they are transcending; like, they have the resources and the capacity to transcend context.
00:23:27 Speaker 1
That’s what they’re doing. And it’s like, holy ****, you are so right.
00:23:30 Speaker 1
Like, as I’m thinking about it: like, in 2016 big data was everything, like that was everywhere.
00:23:35 Speaker 1
And so, of course, it’s in the title of your book.
00:23:39 Speaker 1
But I want to know more like so to me that like is a quintessential.
00:23:45 Speaker 1
characteristic of effective rhetoric. And also, the last recording I did was with a dear old friend of mine, Maggie Werner, who’s a rhetorician at Hobart and William Smith Colleges in upstate New York.
00:23:58 Speaker 1
And she does all kinds.
00:23:59 Speaker 1
And because she teaches a lot of writing, I was like, what are your thoughts on AI for writing like?
00:24:03 Speaker 1
And she’s like, you know, I think a lot of it is a moral panic.
00:24:06 Speaker 1
I think it’s awesome.
00:24:07 Speaker 1
I use it all the time.
00:24:07 Speaker 1
It’s really helpful.
00:24:09 Speaker 1
And so it’s so fascinating to me to hear you talk.
00:24:13 Speaker 1
This is my very long winded answer to a question that nobody asked.
00:24:19 Speaker 1
That it’s like this idea that we have that there has been this shift that does feel like there’s a kind of moral panic around AI in the way that there was a moral panic around.
00:24:33 Speaker 1
like, big data. But there also... there is something real behind it. But it’s like we’re being distracted: there are problems with it, but not the problems that we think are the problems.
00:24:48 Speaker 1
So can you tell me more?
00:24:49 Speaker 2
Yeah, that’s a really good way.
00:24:49 Speaker 1
Like, tell me more, what are the problems?
00:24:51 Speaker 2
The distraction is a great way to put it. I think of that, you know, like a
00:24:53 Speaker 2
magician, like prestidigitation.
00:24:55 Speaker 2
Which makes me sound like a real conspiracy theorist, and I don’t think it’s like, you know, one person like Sam Altman sitting in a chair saying like, hey, folks, you know, we need to talk about AI and reframe the whole conversation.
00:24:58 Speaker 1
No.
00:25:01
Yeah, yeah.
00:25:05 Speaker 1
No, and it also if I could just pause for one second, it’s like it is in a certain sense, I can be a real.
00:25:06 Speaker 2
But I do, yeah.
00:25:11 Speaker 1
What’s the like rationalist like?
00:25:13 Speaker 1
Social thinking.
00:25:14 Speaker 1
Sociology, like 1950s.
00:25:18 Speaker 1
What’s?
00:25:18 Speaker 1
That like James Coleman.
00:25:20 Speaker 1
I’m forgetting the like theoretical school of thought that I’m claiming to be.
00:25:23 Speaker 2
Like structural functionalism?
00:25:23 Speaker 1
a part of. No.
00:25:26 Speaker 1
But more just like that, there’s like like there’s rational, like rationalism like that.
00:25:32 Speaker 1
There’s a kind of rational thought.
00:25:33 Speaker 1
It’s not a conspiracy.
00:25:34 Speaker 1
It’s not that like Sam Altman or these people are like evil.
00:25:36 Speaker 1
It’s that anyone, like, I’m a pragmatist.
00:25:39 Speaker 1
Like anyone who has that kind of anyone who’s gonna wanna.
00:25:43 Speaker 1
Be self protective.
00:25:44 Speaker 1
Anyone is going to want to do what like will maintain their position of whatever.
00:25:48 Speaker 1
And so it’s not that they’re like evil, it’s that they do evil as a means of just doing business.
00:25:56 Speaker 1
Like, that’s just the nature of it.
00:25:57 Speaker 2
Oh for sure.
00:25:58 Speaker 1
And so yeah.
00:25:58 Speaker 2
It’s business as usual.
00:26:00 Speaker 2
When you say like.
00:26:00 Speaker 2
Yeah, exactly.
00:26:01 Speaker 2
A kind of distraction.
00:26:02 Speaker 2
I think that’s it.
00:26:03 Speaker 2
It’s, you know, for me, doing Social Research
00:26:07 Speaker 2
is drawing attention to these kinds of ways that power moves, right, through society.
00:26:11 Speaker 2
The insidious ways the hidden ways, often through rhetoric and language, which is why I left the lab bench and did my degree in communication studies.
00:26:18 Speaker 2
Because I started to realize the place of language mostly in the US courts.
00:26:23 Speaker 2
Um in creating boundaries around who could participate in policymaking around controversial technologies.
00:26:30 Speaker 2
And we can come back.
00:26:31 Speaker 2
to that. But, you know, I think in 2016 or 17 or whenever it was that, we’re saying, there was this conversation around data.
00:26:38 Speaker 2
I think there was more of a conversation, less around the morality and the ethics, and let’s say the philosophy understood broadly and more around the political economy.
00:26:47 Speaker 2
Right, because, back to the business as usual, I think people started to see, ah, like, the uses of data, the monetization, or, I think my colleague Kean Birch at York would say, assetization of data sets was the new business model for
00:27:04 Speaker 2
I mean, let’s call them media companies, ’cause even though they’ll, right, evade that kind of regulatory distinction, I think they are, right.
00:27:13 Speaker 2
And people who are setting the conversation or companies that were setting the conversation like Facebook, and there was, we started to see, I think, as ordinary people.
00:27:21 Speaker 2
Wait a second, right?
00:27:22 Speaker 2
Every move online is like, you know, captured or harvested again.
00:27:27 Speaker 2
And then those data are collated, and, hey, people are making money from my data sets and using them in many ways.
00:27:34 Speaker 2
Um against against the I don’t know.
00:27:38 Speaker 2
Social good, which is problematic framing.
00:27:40 Speaker 2
But yeah, and so.
00:27:43 Speaker 2
And so yeah, I think that it is a kind of distraction.
00:27:45 Speaker 2
You’re.
00:27:45 Speaker 2
But I think that it was a distraction away from.
00:27:48 Speaker 2
I think the conversation was more about the political economy before we started talking about AI.
00:27:53 Speaker 1
Do you?
00:27:53 Speaker 1
Do you think part of that was because, like just on a like, thinking as a human being, it’s like I get excited about AI because I can see how AI can help me.
00:28:02 Speaker 1
It’s like it.
00:28:03 Speaker 1
Honestly, everyone, the bio... I was running late.
00:28:04 Speaker 2
It’s very powerful.
00:28:06
Yeah.
00:28:07 Speaker 1
I was like, write me a bio.
00:28:09 Speaker 1
of Dr. Kelly Bronson.
00:28:10 Speaker 1
And there you go.
00:28:11 Speaker 1
And I was like, that looks great.
00:28:12 Speaker 1
I of course missed your book, and if I had been better, I would have.
00:28:16 Speaker 1
I would have written it probably a little differently, but it worked.
00:28:19 Speaker 1
It did the job. Whereas big data in 20... like, you know, Cambridge Analytica, it didn’t do **** for me, like, that.
00:28:24 Speaker 1
All that did was try to like you know, it gave me better ads or whatever.
00:28:29 Speaker 1
Do you think there’s something?
00:28:30 Speaker 1
Yeah, about this, where it’s like, like, people could more easily see how it was problematic socially, because it didn’t do anything for them
00:28:38 Speaker 2
Personally, I think you’re quite right.
00:28:40 Speaker 2
Yeah, I think that, you know, we are being socialized.
00:28:46 Speaker 2
Um AI is integrated, right?
00:28:48 Speaker 2
Yeah, absolutely.
00:28:50 Speaker 2
I think that may be part of it.
00:28:52 Speaker 2
Yeah, I don’t know.
00:28:52 Speaker 2
I feel like this is a whole side project that one could do.
00:28:55 Speaker 2
Thinking about like tracing the critical conversation.
00:28:57 Speaker 1
And, but, because it’s so accessible too now. It’s like, even, like, on LinkedIn now, it’s like, do you wanna write your little, you know, posting, or here, have AI do it?
00:29:07 Speaker 1
It’s like... it’s not like... and there was a time... like, my son Jack, who’s, like, very anti-AI.
00:29:08 Speaker 2
I know.
00:29:13 Speaker 1
He’s like, the robots are going to kill us.
00:29:14
Oh, interesting.
00:29:15 Speaker 1
I know he is like I hate it.
00:29:17 Speaker 1
Nobody.
00:29:17 Speaker 1
She he like we got into.
00:29:18 Speaker 1
This.
00:29:18 Speaker 1
Huge fight a few weeks ago cause I was like saying like.
00:29:21 Speaker 2
He thinks people shouldn’t use it because we’re sort of effectively making.
00:29:24 Speaker 2
It more powerful or feeding it we?
00:29:25 Speaker 1
I’m.
00:29:26 Speaker 1
Yeah.
00:29:26 Speaker 1
And I think also, you know, the environmental cost... that, you know, there are consequences to it, and that it’s also, like.
00:29:28 Speaker 2
Yeah, absolutely, yeah.
00:29:34 Speaker 1
You know, he’s like a real lefty and I think it’s just to him.
00:29:38 Speaker 1
It’s maybe like I don’t know, like a big data thing.
00:29:41 Speaker 1
I’m not like I’m not sure about all of the complexities of his argument, but he’s like, and he just, I think morally thinks he’s like, it’s just bad.
00:29:49 Speaker 1
Yeah, but it is becoming and I have my own thoughts, which I’ve shared on other episodes where I definitely see it as.
00:29:57 Speaker 1
I like to say it’s more nuanced, but he would say, you’re just more centrist or whatever. But it’s like, I see there’s good and there’s bad; like, there’s implications of everything.
00:30:06 Speaker 1
And I think it’s like there are times where technology, technology always freaks people out.
00:30:11 Speaker 1
People panic and then it just becomes kind of integrated.
00:30:13 Speaker 1
Into our lives.
00:30:14
Life sucks.
00:30:14 Speaker 2
That is true if one looks.
00:30:16 Speaker 2
I’m not an historian of technology, but of course I belong to this broad field.
00:30:20 Speaker 2
I’m a sociologist within this field called Science and Technology Studies; that’s sort of my happier home within social
00:30:25 Speaker 2
research. And there are many historians of technology who operate in that space, and I love their work.
00:30:31 Speaker 2
Yeah.
00:30:32 Speaker 2
And yeah, if one looks historically, of course, there’s all of that.
00:30:35 Speaker 2
both the kind of techno-solutionist, very positive, deterministic rhetoric that, you know, came with every new invention, from Gutenberg’s press to, right, the telephone, radio.
00:30:40 Speaker 1
Yeah.
00:30:48 Speaker 2
That was the.
00:30:48 Speaker 1
I’ve house.
00:30:49 Speaker 2
Yes, and but then there’s also the critical, you know, critical social scientists included.
00:30:56 Speaker 2
Um, there’s a there.
00:30:57 Speaker 2
Is that critical determinism, which is it’s all bad or it’s the end of humanity, and the historians do a great job of revealing, through careful historical analysis, how this is in.
00:31:01 Speaker 1
Right.
00:31:07 Speaker 2
I mean it’s I guess a different version of saying what I said before, it’s a distraction.
00:31:12 Speaker 2
It’s a kind of prestidigitation, because it’s often...
00:31:14 Speaker 1
Yeah.
00:31:15 Speaker 1
What’s that word you keep saying?
00:31:16 Speaker 1
Can you just spell it?
00:31:16 Speaker 2
It’s like sleight of hand, you know. Presti... so, not to spell it, but presti...
00:31:19 Speaker 1
Oht like, did you?
00:31:21 Speaker 1
OK.
00:31:22 Speaker 1
Digitation.
00:31:24 Speaker 2
Yeah, it’s like moving your hands around like a magician, right, to distract, to distract people from.
00:31:25 Speaker 1
Right.
00:31:26 Speaker 1
Yes.
00:31:27 Speaker 1
Yeah, like sleight of hand, yeah.
00:31:29 Speaker 2
the, you know, the things that are potentially really going on.
00:31:33 Speaker 2
The truth I mean, which I invoke with caution, but.
00:31:36 Speaker 1
Oh my God.
00:31:36 Speaker 2
And so, yeah, I think it’s always yes-and, right? It’s like, um, it’s not one or the other, and it’s often both, or it’s...
00:31:37 Speaker 1
OK.
00:31:37 Speaker 1
Yes.
00:31:42 Speaker 1
Yes, exactly, exactly.
00:31:48 Speaker 2
Yeah, it’s complex, which makes doing Social Research tricky.
00:31:52 Speaker 2
When I talk to my parents, I think we’ll talk about that.
00:31:54 Speaker 2
You know, my very lovely parents, who came from very modest means, neither of whom has a university education.
00:31:59 Speaker 2
They’re very bright and definitely have a kind of worldly knowledge.
00:32:02 Speaker 2
But they’re often asking me, like, well, what’s the... you know, just give us the answer or give us the problem.
00:32:09 Speaker 2
And Mom was like, well, it’s systemic, you know.
00:32:12 Speaker 1
Yes.
00:32:13 Speaker 2
But capitalism and it’s, you know.
00:32:13 Speaker 1
Stop, right?
00:32:15 Speaker 1
Yeah, but it’s not just capitalism.
00:32:16 Speaker 2
And.
00:32:16 Speaker 2
So so.
00:32:17 Speaker 1
No, but it’s like human, but it.
00:32:18 Speaker 2
Of course, that just.
00:32:19 Speaker 1
But it’s all of it.
00:32:20 Speaker 1
I mean, and that again is like, why totalitarian?
00:32:24 Speaker 2
Yes.
00:32:24 Speaker 1
ism, to go back.
00:32:24 Speaker 2
That’s so effective.
00:32:24 Speaker 1
It’s like this transcendence of context is that it’s like.
00:32:26 Speaker 2
That’s totally.
00:32:29 Speaker 1
Life.
00:32:29 Speaker 2
In your message for sure.
00:32:30 Speaker 1
You know I.
00:32:31 Speaker 2
And I think, like the robots will kill us is a much easier message than what.
00:32:31 Speaker 1
Was thinking too.
00:32:33 Speaker 1
Right. Or the immigrants will kill us, or, you know... But I think it’s also why, like, Barack Obama was so much more successful than Hillary Clinton.
00:32:36 Speaker 2
Yes.
00:32:36 Speaker 2
Yeah, yeah, yeah.
00:32:42 Speaker 1
I was talking about this with my husband the other day. It’s like, he was really good at keeping things simple, like he was really good at being like, hope.
00:32:52 Speaker 1
We just need hope. And it’s like, and Hillary Clinton, who, I think, you know, there’s all kinds of problems that we could talk about and ways in which she is problematic.
00:32:53 Speaker 2
Yeah.
00:33:00 Speaker 1
But she is ******* brilliant, like she is an incredibly intelligent person.
00:33:02 Speaker 2
Yeah.
00:33:05 Speaker 1
And she’s a data wonk.
00:33:06 Speaker 1
So she also would be like she would talk about the complexities of things and how we need to understand this nuance.
00:33:10 Speaker 2
Yeah.
00:33:12 Speaker 1
And it’s like nobody cares about your nuance, which is so depressing.
00:33:15
I know.
00:33:15
Quickly.
00:33:16 Speaker 2
It’s still the pacing.
00:33:18 Speaker 2
Well, you know, that’s what my book is kind of about.
00:33:21 Speaker 1
Tell me more.
00:33:21 Speaker 2
OK, so I started the book in 2016. So I’m not only most often the only social scientist, or person,
00:33:22 Speaker 1
Tell me more about the book.
00:33:29 Speaker 2
Doing social justice research, I would say I would add the justice in there within the kind of critical AI community.
00:33:37 Speaker 2
But I’m also I happen to study AI in this very particular domain.
00:33:41 Speaker 2
That’s quite understudied globally, which is a “use case,” as
00:33:45 Speaker 2
sometimes people call it: agriculture.
00:33:48 Speaker 1
Right, yeah.
00:33:49 Speaker 2
And like, most people don’t
00:33:50 Speaker 2
put those things together, and they certainly didn’t in 2016 when I started.
00:33:54 Speaker 1
Yeah.
00:33:55 Speaker 2
And so in 2016, it was like, oh, wait a second.
00:33:57 Speaker 2
I had, for a decade, as you said in my bio, I had studied basically public resistance against GMOs and thought very carefully about, yeah, organisms.
00:34:04 Speaker 1
Genetically modified organisms, OK, yeah.
00:34:06 Speaker 2
Sorry, yes.
00:34:07 Speaker 2
Um, so I had left the lab bench where I was effectively practicing genetic technique because I found that I was more interested in the kind of public face of the science.
00:34:15 Speaker 2
And like, why are farmers suing these big Agri businesses?
00:34:19 Speaker 2
Why are they?
00:34:20 Speaker 2
Learned, and even though I hadn’t really **** *** my lab supervisor, Christopher Eckert, at Queen’s University, um, I felt like when I tried to engage my fellow scientists in those conversations,
00:34:32 Speaker 2
Like, why do you think Members like?
00:34:34 Speaker 2
Why are farmers concerned about these tools and why are people protesting?
00:34:37 Speaker 2
And maybe why should we have labeling?
00:34:39 Speaker 2
Why do some people?
00:34:40 Speaker 2
It was like I had committed a breach of decorum, right?
00:34:42 Speaker 2
It was like.
00:34:43 Speaker 2
Oh, that people just don’t understand the science.
00:34:47 Speaker 1
My God, I just.
00:34:47 Speaker 2
Which is interesting.
00:34:47 Speaker 1
I put a little note that I want to talk about too, and it was again like and we’ll just.
00:34:52 Speaker 1
I just wanna put a pin in this, but I wanna say it out loud and then we can maybe get to it a bit later.
00:34:57 Speaker 1
But I do.
00:34:58 Speaker 1
I was, like, just in a chat with some friends about how there are certain people within the social sciences who suggest that certain among us are, like, not real social scientists, because we’re activists, because we are too political or politicized, and that the real, like, real research is, like, the ANTIC, which means it’s.
00:35:11 Speaker 2
Cool, that is size.
00:35:18 Speaker 2
Like what? Quantitative?
00:35:20 Speaker 1
I guess I never understand, cause I’m like, I am, like, one of the most quantitative people in our department.
00:35:25 Speaker 2
Yes, you’re right, it’s true.
00:35:26 Speaker 1
It’s like I, you know, and you have this background in science and it’s like and so it’s.
00:35:28 Speaker 2
Well, you can beat you know.
00:35:32 Speaker 1
I just.
00:35:32 Speaker 1
It’s like it drives me crazy because it’s like, well, that in itself is a political idea.
00:35:38 Speaker 1
But I would love to like hot.
00:35:39 Speaker 2
Absolutely.
00:35:40 Speaker 2
That in itself is a political idea, which is what really had me leave the sciences, as I started to think, well, like, what?
00:35:46 Speaker 1
Yeah.
00:35:46 Speaker 1
Well, so yeah, tell me.
00:35:48 Speaker 2
Who’s in charge?
00:35:49 Speaker 2
Like you know who?
00:35:49
Right.
00:35:51 Speaker 2
Why is this a better way of understanding the world, especially when it comes to questions of, well, power?
00:35:57 Speaker 2
Yeah.
00:35:57 Speaker 2
Also like perception and feelings and values and injustice, yeah.
00:36:00 Speaker 1
Right.
00:36:00 Speaker 1
Yes.
00:36:00 Speaker 1
Yeah.
00:36:01 Speaker 1
Yeah, I’m just like, why is it like what you are trying to do?
00:36:04 Speaker 1
And this also reminds me again of Ivy Borchert, who, like, I think that you and she have a, like, you’re doing totally different areas of research.
00:36:10 Speaker 1
But similar in that you both like are very fancy.
00:36:13 Speaker 1
Both published a lot.
00:36:14 Speaker 1
You both get a lot of big grants and also do a lot of work with like.
00:36:17 Speaker 1
I don’t know.
00:36:18 Speaker 1
Like stakeholders, like with the government, policymakers, like, you’re very connected to the sort of wider, like, public. And, like, she was saying, she’s not very popular cause she does stuff on, like, gender, health and the professions.
00:36:30 Speaker 1
So she’s often at committees of like doctors, right?
00:36:33 Speaker 1
And she’s the one person she’s like.
00:36:34 Speaker 1
Well, we need to look at it with a gender lens.
00:36:36 Speaker 1
And they’re like.
00:36:38 Speaker 1
Like you know, no we don’t.
00:36:39 Speaker 2
Yeah.
00:36:40 Speaker 1
And it was like, so interesting because she kept pointing out.
00:36:43 Speaker 1
It’s like, you know, anyone can have a gender lens, but you need training to do it.
00:36:47 Speaker 1
And it’s so interesting to me that it’s like these scientists are thinking like, well, we know about the genetics of this and so we therefore are the experts at thinking about the social implications.
00:36:53 Speaker 2
Yeah, something just an expert and then.
00:36:57 Speaker 1
It’s like, no, you need a sociologist for that.
00:36:59 Speaker 1
For the historical implications, you need a historian for that.
00:37:02 Speaker 1
And yet, you know, so it’s so, anyway, so you were.
00:37:02 Speaker 2
I think so, yeah.
00:37:03 Speaker 2
Uh huh.
00:37:05 Speaker 1
Like I love.
00:37:06 Speaker 2
You see this everywhere.
00:37:07 Speaker 2
You know, there was, I was on a Council of Canadian Academies panel years ago, but there was a panel.
00:37:12 Speaker 2
I don’t.
00:37:13 Speaker 2
Maybe it was two years ago on the Funding Agency.
00:37:16 Speaker 2
There is this like CCA.
00:37:18 Speaker 2
You know, this, like, expert panel, and it was on NSERC.
00:37:21 Speaker 2
So, like, the funding agency for the natural sciences and engineering. And it was all about, you know, it was supposed to be a kind of critical assessment of funding, and this panel was populated entirely by scientists.
00:37:34 Speaker 2
They didn’t.
00:37:34 Speaker 2
You know, not one person who had as an area of expertise funding, right, or, like, diversity in the sciences. And I just thought that was so telling.
00:37:46 Speaker 2
Absolutely.
00:37:47 Speaker 2
You know, sometimes I used to go to the.
00:37:50 Speaker 2
Canadian Science Policy Conference. I always get confused with the government agency, CSP, anyway, CSPC. And it was a lot of that, you know, kind of thinking through the public face, the kinds of questions that I ask about the sciences and technologies. But it was mostly scientists who, yeah, think, because you’re an expert in this one particular core domain, a physicist,
00:38:11 Speaker 2
therefore you can speak, right, widely about it. And it’s a real problem.
00:38:15 Speaker 2
So.
00:38:15 Speaker 2
So this kind of rankled me, and in particular I was like.
00:38:18 Speaker 2
And this maybe comes back to our earlier conversation about doing social research.
00:38:22 Speaker 2
I kind of think my fundamental job is to just always be thinking about the limitations of thinking, my own, but also, like, in a discipline, including my own, right?
00:38:32 Speaker 2
So I’ve published papers where I’m like, come on, sociologist, you think you know what you’re talking about when you talk about X, right?
00:38:38 Speaker 2
In this case, family farming or biotechnology, or I did it about data in my
00:38:42 Speaker 2
book, right. And so hopefully I’m taking that lens on my own work as well, but I think that that’s a fundamental part of my job in doing social research.
00:38:52 Speaker 2
So I started to have that intuition on the lab bench.
00:38:55 Speaker 2
Like, I’m curious about this, you know, and there was just a lack of curiosity or, in fact, like, a kind of superiority.
00:39:02 Speaker 2
And so I left the lab.
00:39:03 Speaker 2
And I went to Saskatchewan, where farmers were suing these big input supply companies, Monsanto being one, who were also suing farmers over these, they were patent disputes over these seed technologies.
00:39:09
Hmm.
00:39:17 Speaker 2
And I started to realize that actually the farmers
00:39:20 Speaker 2
were quite scientifically educated, and, more interesting to me, was that their concerns weren’t scientific at all.
00:39:30 Speaker 2
So you know they they weren’t anti science, they actually weren’t concerned at all about the science per se.
00:39:36 Speaker 2
They were concerned about the kind of economic impact of this, they would call it pollution, of these GMO seeds on their organic property,
00:39:45 Speaker 2
disabling them from selling in European markets. They were concerned about the implications on, I would say, social cohesion in rural communities. So Monsanto was paying farmers to spy on one another and, you know, no longer were farmers meeting at Coffee Row.
00:39:57
Wow.
00:40:00 Speaker 2
It was like, you know, the biotech farmers versus, and it was really fracturing these already fractured, because of, right, like, really the corporate control of agriculture, these rural communities that were just gutted, right?
00:40:13 Speaker 2
Farming is an increasingly tough business, an
00:40:16 Speaker 2
increasingly tough economic proposition, because of the power that these really powerful companies
00:40:20 Speaker 2
wield. Um, and, but the real,
00:40:24 Speaker 2
the thing that really, I started to think, was so interesting, is it became clear to me they were really concerned, these farmers, about the models of risk, of scientific fact, right, that were being used by our Canadian,
00:40:40 Speaker 2
I would say legal system regulatory system, Health Canada and the Canadian Food Inspection Agency scientists.
00:40:46 Speaker 2
to study and define risks. They were like, these are dumb ways of assessing, right.
00:40:51 Speaker 1
Yeah.
00:40:52 Speaker 2
I’d bet that the farmers said it with more eloquence, but, right, so, not to geek out and go down too much into the weeds, but, right, the models that are used, it was substantial equivalence, right.
00:41:02 Speaker 2
So this GM tomato that’s fundamentally altered, right, at the genetic level, of genes or regulatory elements, is just being compared chemically to a shelf, like, a standard conventionally grown tomato, for chemical equivalence.
00:41:17 Speaker 2
Or, like, the long-term health effects, the health effects in the environment and the potential spread of GM material via pollen drift, with, like, the, I would say, the kind of ecological, right, the kind of Rachel Carson kind of model.
00:41:31 Speaker 2
What are the long term risks?
00:41:33 Speaker 2
Um, that was not so.
00:41:35 Speaker 2
They were really raising, I thought, these sophisticated questions about different models of scientific risk, which then would have an impact on who was able to participate in the regulatory system.
00:41:47 Speaker 2
Right.
00:41:47 Speaker 2
Because if it’s just something to be studied in a lab by a
00:41:50 Speaker 2
scientist,
00:41:51 Speaker 2
then it’s like, this is sound science, right?
00:41:53 Speaker 2
Everything’s fine.
00:41:54 Speaker 2
There we go.
00:41:55 Speaker 2
Sound scientific model is good.
00:41:57 Speaker 2
Yeah, it was really creating boundaries around public participation, and that’s when I became the kind of social researcher
00:42:04 Speaker 2
I am today, yeah.
00:42:05 Speaker 1
Yeah.
00:42:05 Speaker 1
And just to clarify, sorry, I was tuning out just for a second. So you’re saying that the, like, the kind of Rachel Carson, larger environmental stuff was not being looked at by the government, but that’s what the farmers were saying, yeah.
00:42:12 Speaker 2
It hasn’t been looked at, and it still isn’t. It’s even got narrower and narrower, because now we have gene editing, which, you know, that’s also, I think, if one were to look, as a rhetorician or an historian of science would, like the AI conversation, I feel like the regulatory conversation has followed the,
00:42:28 Speaker 2
the, I don’t know, the scientific conversation, when it comes to genetic technologies.
00:42:33 Speaker 2
Or you could call it the bioeconomy, where, you know, so-called gene editing, or, like, CRISPR technology,
00:42:40 Speaker 2
being one, it’s thought to be, or presumed to be, because this is the, I would say, rhetoric coming from the scientific community, precise, right?
00:42:48 Speaker 2
It’s a very precise intervention, medically or scientifically.
00:42:52 Speaker 1
What?
00:42:52 Speaker 1
What is?
00:42:52 Speaker 1
It.
00:42:53 Speaker 1
Is it crispy crisper?
00:42:53 Speaker 2
It’s gene editing.
00:42:54 Speaker 2
CRISPR now.
00:42:56 Speaker 2
What is that? Oh, goodness.
00:42:57 Speaker 2
Yeah.
00:42:57 Speaker 2
How do I communicate this publicly?
00:42:58 Speaker 1
You can get into the weeds.
00:42:59 Speaker 1
That’s OK.
00:42:59 Speaker 2
Can I get into the weeds a little bit?
00:43:00 Speaker 1
Yeah, totally.
00:43:01 Speaker 2
That’s, well, so recombination is, like, creating
00:43:04 Speaker 2
a new organism, right?
00:43:06 Speaker 2
By using methods in a laboratory to move genes from one species, often, to another, taking and isolating a gene, moving it with regulatory elements that turn genetic function on and off.
00:43:17 Speaker 2
So for example, like fish genes in tomatoes, because fish, you know, they live through winter.
00:43:25 Speaker 2
In the canal right now, right.
00:43:26 Speaker 2
And they just go.
00:43:27 Speaker 2
Sleep.
00:43:27 Speaker 2
They go into a state of torpor. And how do fish not freeze to death, right, in the canal that freezes over and we all skate on?
00:43:29 Speaker 1
OK. Yeah.
00:43:34 Speaker 2
Well, they have a higher sugar content at the cellular level.
00:43:37 Speaker 1
Oh my God, I totally could be.
00:43:37 Speaker 2
So clever, scientists, then.
00:43:38 Speaker 1
A fish tissue.
00:43:40 Speaker 2
And we could all just go to sleep for the winter.
00:43:42 Speaker 2
Wouldn’t that be nice?
00:43:43 Speaker 2
A bear would be nicer.
00:43:44 Speaker 2
I would like a den versus the cold canal, but anyway.
00:43:44 Speaker 1
Rats.
00:43:48 Speaker 2
So so, you know, clever scientists have isolated the gene and the element.
00:43:52 Speaker 2
So, to create a higher sugar content, to prevent strawberries, for example, from freezing when shipped in a reefer, it’s called, a freezer truck, right, over long distances.
00:43:56 Speaker 2
As.
00:43:57 Speaker 1
So you can have a longer growing
00:44:01 Speaker 2
season.
00:44:01 Speaker 2
Oh yeah, and a longer, maybe longer growing season, but I think it’s mostly for the transportation of food from, because,
00:44:08 Speaker 1
Right.
00:44:10 Speaker 2
Places of growing to all over the world.
00:44:13 Speaker 2
And yeah, so.
00:44:15 Speaker 2
Where are we going with this?
00:44:16 Speaker 2
Oh, CRISPR. So the CRISPR technology is just even more precise.
00:44:21 Speaker 2
It’s like an edit.
00:44:23 Speaker 2
I mean, that’s a metaphor in and of itself. But, and because it’s a kind of more precise intervention,
00:44:29 Speaker 2
I feel like the public conversation has not followed that; the regulation has just assumed,
00:44:35 Speaker 2
Oh, this is more precise.
00:44:36 Speaker 2
therefore, it has, kind of, fewer unintended or knock-on effects.
00:44:39 Speaker 1
Oh, it’s like they’ve just streamlined it.
00:44:40 Speaker 2
Um, yes, we’ve just streamlined that.
00:44:41 Speaker 1
This is.
00:44:42 Speaker 1
Inherently better.
00:44:43 Speaker 2
Yeah, I don’t know.
00:44:44 Speaker 2
Again, I haven’t studied this systematically.
00:44:46 Speaker 2
But I feel like that’s kind of happened. And in fact, I referenced in my book, coming back to the book, this great article by a woman named Ann
00:44:57 Speaker 2
Morning.
00:44:57 Speaker 2
I think it’s from 2003.
00:44:58 Speaker 2
It’s really old where she talks about scientific racism and my book does not deal with race or racial injustice and technology because I’m not the person to do that.
00:45:09 Speaker 2
But what I really like is the heuristic that she uses in this article where she says, you know.
00:45:14 Speaker 2
We, many of us, know about the story of, say, the perpetuation of scientific racism through “old school sciences,” and I put that in quotations, like phrenology, right, or, like, what?
00:45:22 Speaker 1
Age.
00:45:25 Speaker 1
Yeah.
00:45:27 Speaker 2
I mean, even eugenics.
00:45:28 Speaker 1
The entire, the census, like, the entire creation of science was to justify, like, racial categories.
00:45:30 Speaker 2
Yeah.
00:45:30 Speaker 2
Since it.
00:45:31 Speaker 2
Yeah, yeah.
00:45:31 Speaker 2
Yeah, exactly.
00:45:33 Speaker 2
Yeah, right.
00:45:34 Speaker 2
Or, like, it’s a bit easier to see, or has been easier to
00:45:39 Speaker 2
Point.
00:45:39 Speaker 1
Yes.
00:45:40 Speaker 2
out. Um, but she then talks about the move toward, like,
00:45:44 Speaker 2
biotechnologies, as an increasing kind of sophistication of the technology, where the critical conversation hasn’t really followed, or necessarily been able to follow, because the science is so complex and because the tools, right, are complex, proprietary, like, and so it’s a bit, it kind of goes under, it becomes insidious, the scientific racism. And I really like that
00:46:05 Speaker 1
Oh, totally.
00:46:07 Speaker 2
Heuristic for thinking through.
00:46:10 Speaker 2
what kind of happens when we start to talk about things like big data or AI. And again it comes back to who’s in charge of the critical conversation, or who’s steering that bus, right?
00:46:22 Speaker 2
If the science or the technique is so complex, then it’s easier, I think, for those who sit in positions of power, not just the technologists but the technologists who happen to work for really powerful and very wealthy corporations,
00:46:39 Speaker 2
to say, ooh, we got this, right. And I love, just even, coming back to that point I made earlier about the open letters.
00:46:47 Speaker 2
Right.
00:46:47 Speaker 2
Like the Sam Altman, the OpenAI open letter, or the, there were, like, white papers. And I even just love the language of the open letter,
00:46:56 Speaker 2
The White Paper, like a government white paper, was very much like we’re just going to.
00:47:00 Speaker 2
Tell you about something really important, but like a bunch of it is going to be redacted.
00:47:04 Speaker 2
We’re just going to tell you what you.
00:47:05 Speaker 2
Need to know and nothing more, right? Right.
00:47:07 Speaker 2
So I just think it’s so poignant that they call these open letters, or it really shows that a big part of the conversation is closed, necessarily closed, to who has the key questions, rather than open.
00:47:14 Speaker 1
Oh, interesting, right?
00:47:16 Speaker 1
Right.
00:47:16 Speaker 1
Like if you need an open letter like that implies, yeah.
00:47:18 Speaker 2
What’s closed?
00:47:19 Speaker 1
What?
00:47:20 Speaker 1
What?
00:47:20
Right.
00:47:20 Speaker 1
What’s happening in the closed letters?
00:47:21 Speaker 2
There’s a real like, we got this behind closed doors.
00:47:24 Speaker 2
I think approach.
00:47:26 Speaker 2
Hey.
00:47:26 Speaker 1
And you can see how effective that is.
00:47:27 Speaker 1
Well, I was.
00:47:28 Speaker 1
I was just gonna say too, like one person.
00:47:30 Speaker 1
I just assigned in my sociology class a while ago, it’s a little bit older now, but Ruha Benjamin.
00:47:34 Speaker 2
So so great, yeah.
00:47:35 Speaker 1
I think I’ve mentioned her. Like, she does a lot of stuff, and I just remember, about,
00:47:39 Speaker 1
racism within coding, within technology, all that kind of stuff.
00:47:43 Speaker 1
And I, I just assigned it again in class, which is the only time I read,
00:47:48 Speaker 1
I’m sad to admit because it’s like right.
00:47:52 Speaker 2
Really teaching it.
00:47:52 Speaker 1
It’s like I assigned books exactly right.
00:47:53 Speaker 2
Keeps us current exact goodness for students as students.
00:47:56 Speaker 1
Exactly.
00:47:57 Speaker 2
Undergrads, yeah.
00:47:57 Speaker 1
Oh my God, totally gets me to actually like stay more current anyway.
00:48:00 Speaker 1
But she talks a lot about the ways in which, like and again, it’s not that there’s, like an evil person who, like, it’s not that it’s.
00:48:08 Speaker 1
Like you know, some old racist redneck.
00:48:10 Speaker 1
It’s just some dudes in Silicon Valley who aren’t trained to think about these things, you know, and they’re just applying their own.
00:48:11 Speaker 2
No.
00:48:18 Speaker 1
Yeah.
00:48:19 Speaker 1
perspective, their own life experiences, and the consequences are enormous for others. But it’s like, and I think people get so caught up in, like, oh, they didn’t mean to, or, you know, like, I’ll hear people be like, of course they did, they’re really smart.
00:48:30 Speaker 1
And it’s like, well, **** yeah, they’re smart.
00:48:32 Speaker 1
But it doesn’t mean that they know everything about everything, you know.
00:48:35 Speaker 2
And I think it comes back down to, I guess, these fundamental kind of frameworks or presuppositions about what is expertise and right where are the boundaries between science and society, which is the title of my.
00:48:43 Speaker 1
Yes.
00:48:47 Speaker 2
Chair.
00:48:47 Speaker 2
Yeah, right.
00:48:49 Speaker 2
Because, yes, they didn’t mean to, but they operate as
00:48:56 Speaker 2
If.
00:48:57 Speaker 2
They’d know everything, right?
00:48:58 Speaker 2
They’re not, they’re not trained to think about ideology, like a social researcher would be, right, or about limitations of knowledge.
00:49:05 Speaker 2
And in fact, given the received messaging, yeah, especially for like white men, right?
00:49:11
I’m.
00:49:12 Speaker 2
Not to get into the gender aspect of it, but around technology and, right, there’s a, they’re not really meant to think about, or,
00:49:15 Speaker 1
Oh God.
00:49:19 Speaker 2
There’s no checks and balances on.
00:49:20 Speaker 1
How?
00:49:20 Speaker 2
And on their.
00:49:23 Speaker 1
Power. And it’s also, like I was saying before we started recording, I was talking about my idea of a practice of humility, which I talk about in my book.
00:49:30 Speaker 1
But to me, like, what drives so much of this is hubris. Like, it is just the hubris of, I need to know, I need to be extraordinary.
00:49:30 Speaker 2
Right.
00:49:34 Speaker 2
Yes.
00:49:36 Speaker 2
Daily.
00:49:38 Speaker 1
And I think that’s partly like what I wanted to make this podcast is like, I want the realness there.
00:49:43 Speaker 1
Like, I want people to see, like, how
00:49:45 Speaker 1
smart my friends are.
00:49:46 Speaker 1
How?
00:49:48 Speaker 1
It’s like, how, like, how much thought and how much work goes into it. But also, like, I intentionally ask questions of, like, I don’t get it. Like, sometimes my impulse can be, I wanna be smart.
00:49:49
No.
00:49:59 Speaker 1
Like I want people to think I’m smart.
00:50:00 Speaker 2
Yeah.
00:50:01 Speaker 1
Even you?
00:50:01 Speaker 2
Be viewed as smart, as authoritative, but there’s a politics to that, absolutely.
00:50:03 Speaker 1
Right.
00:50:04 Speaker 1
Yeah.
00:50:05 Speaker 1
And it’s just like and so to me, it’s like I’m trying to like model humility of being like, explain that to me.
00:50:09 Speaker 2
And.
00:50:09 Speaker 2
That.
00:50:10 Speaker 1
I don’t understand this thing.
00:50:12 Speaker 1
I don’t know, because it is so hard for so many of us. And I also am working on this idea of, like, stupidism.
00:50:19 Speaker 1
It’s like I think a thing where it.
00:50:19 Speaker 2
I.
00:50:19 Speaker 2
That.
00:50:20 Speaker 1
Like we all are stupid about certain things, but it’s like so undoing emotionally for people to think that we’re stupid about something, that we all pretend to be smarter about all kinds of things and.
00:50:29 Speaker 2
Absolutely.
00:50:31 Speaker 2
That we perpetuate this system.
00:50:33 Speaker 2
This is getting into the unpolished academic I think.
00:50:37 Speaker 2
I think you’re exactly right.
00:50:38 Speaker 2
You know, no one really talks about how the sausage is made,
00:50:42 Speaker 2
a book or an article or a lecture. In the Academy we all perform and then, you know, give our students these polished objects.
00:50:47 Speaker 1
Yeah.
00:50:53 Speaker 2
But there’s still such messiness,
00:50:53 Speaker 1
Right.
00:50:54 Speaker 2
Ah, or unpolished behind the scenes and the.
00:50:56 Speaker 1
Yeah.
00:50:58 Speaker 2
Including the emotional stuff, right?
00:51:00 Speaker 1
Totally.
00:51:00 Speaker 2
Foremost, and we don’t model that for our students because it’s not really a safe space for modeling that kind of.
00:51:04 Speaker 1
No, and it’s embarrassing.
00:51:06 Speaker 2
We’re supposed to perform. You know, I have this such bright, um, PhD student, who’s graduated and is now a postdoc, and recently interviewed for a position that I think she ought to have got.
00:51:18 Speaker 2
And you know, the feedback that she got after the campus visit was, well, you just weren’t confident enough.
00:51:25 Speaker 1
Oh my God.
00:51:26 Speaker 2
And I thought that just I was so depressed for days.
00:51:29 Speaker 2
I just thought, well, you know, that’s basically subtext for.
00:51:33 Speaker 2
You just didn’t perform white masculinity.
00:51:35 Speaker 1
Yes, right.
00:51:36 Speaker 1
It’s like you work too much.
00:51:36 Speaker 2
And so you just weren’t authoritative enough.
00:51:37 Speaker 1
You were a girl.
00:51:39 Speaker 2
You didn’t claim your expertise, defined in this particular way. Because I think there’s a way to do expertise, right. To be an expert is to be comfortable with the
00:51:51 Speaker 2
Limits of your expertise.
00:51:51 Speaker 1
Yeah, cause you know it. And oh, my God.
00:51:53 Speaker 2
Right, exactly.
00:51:54 Speaker 2
And to say, you know, to be comfortable enough to say, I don’t know, or, look at all of the work that I put into this, and look at how I just, like, had to lay in bed for a whole day and eat cookies because I just couldn’t motivate myself to work on the
00:52:03 Speaker 1
Oh
00:52:03 Speaker 1
my God, yes, right?
00:52:06 Speaker 2
Anymore or I don’t know.
00:52:08 Speaker 1
Yeah.
00:52:08 Speaker 1
Or I spent all day yesterday instead of preparing for this, like getting into a fight with my husband cause we couldn’t.
00:52:13 Speaker 2
Rained.
00:52:14 Speaker 1
It took us 2 hours to get our kid to school because, you know, like, kid of profs is like, I hate school.
00:52:17
Yeah.
00:52:19 Speaker 1
I don’t want to go, you know, and then then it’s like it’s your fault.
00:52:20 Speaker 2
Ohhhhh no.
00:52:22 Speaker 1
No, it’s your fault.
00:52:23 Speaker 1
And like it it just like life lifes, you know, it just happens all the time and I don’t.
00:52:25 Speaker 2
Live.
00:52:26 Speaker 2
Yeah, totally.
00:52:29 Speaker 2
But we.
00:52:29 Speaker 1
Have.
00:52:29 Speaker 2
Perform that in the Academy.
00:52:31 Speaker 1
No.
00:52:31 Speaker 2
So yeah, that’s this like kind of pet project.
00:52:33 Speaker 2
You were gonna talk to me later about it, but I have which?
00:52:35 Speaker 1
Yeah.
00:52:35 Speaker 2
Well, I don’t even know.
00:52:36 Speaker 2
I don’t think it’s a project.
00:52:38 Speaker 2
I don’t think it’s actually social research.
00:52:40 Speaker 2
I think it might be a movement, but I feel like it’s important within the Academy because.
00:52:41 Speaker 1
Yeah.
00:52:45 Speaker 2
it is, and it comes back to my, I guess, personal history, which is, you know, I come to academics as a first-generation scholar, and, you know, with an immigrant family, my mother’s family, and without a lot of opportunity.
00:53:02 Speaker 2
And I remember as an undergrad feeling being marked by that, like on my body, you know, the way that I sat in class and not understanding subtext.
00:53:08
Hmm yeah.
00:53:12 Speaker 2
And in the sciences, it was less tricky, but I didn’t.
00:53:15 Speaker 1
How so?
00:53:16 Speaker 1
What do you mean?
00:53:16 Speaker 2
Well, because it’s a bit more black and white, right?
00:53:19 Speaker 2
You’re taking a math class, and it’s, but, you know, I took all my electives in the arts, and I took classics.
00:53:24
Yeah.
00:53:25 Speaker 2
And I had never read the classics, right? Or, I mean, I had read some literature, of course, because of the high school curriculum.
00:53:36 Speaker 2
But it was just a lot of catching up in not just the reading and the learning.
00:53:42 Speaker 2
The content based learning but also in how to behave right.
00:53:45 Speaker 1
And how to talk?
00:53:46 Speaker 2
How to talk?
00:53:47 Speaker 2
Yeah, like and I and it’s so interesting.
00:53:49 Speaker 2
When I left, like, bench science and I started this degree in sociology of technology,
00:53:55 Speaker 2
I got a job as a research assistant doing interviews with, actually, cattle farmers about climate change, which was interesting because this was, like, the early aughts, and nobody was, no cattle farmer in Alberta was talking about climate change per se.
00:54:06
Wow.
00:54:07 Speaker 1
Yeah, yeah, brain.
00:54:07 Speaker 2
They were living through it, of course, but, which is an interesting sort of side note, it was the first time I’d ever heard myself recorded.
00:54:15 Speaker 2
And I remember having this weird kind of like uncanny valley or whatever it’s called.
00:54:20 Speaker 2
Like it was me, but it was so not me.
00:54:23 Speaker 2
I was articulate and I just hated it right.
00:54:25 Speaker 1
Hmm, you’re doing a good job.
00:54:25 Speaker 2
I was.
00:54:26 Speaker 2
I was clearly like performing.
00:54:28 Speaker 2
I was doing a good job performing.
00:54:31 Speaker 2
Expert.
00:54:31 Speaker 1
Yeah.
00:54:32 Speaker 2
And so, yeah, I I guess, you know, I’ve started to notice or I started to think early on when I was finishing my PhD and I was pregnant with my son and I was bussing myself all over southwestern Ontario to teach classes on contract and having to, like, pump my breasts on the Greyhound bus.
00:54:49 Speaker 2
And just thinking about.
00:54:50
Yeah.
00:54:52 Speaker 2
the place of, I don’t know, women in the Academy, the balancing of, as you said before, life, what I might call, like, reproduction and, yeah, academic production, gender, but also class, because we’ve kind of stopped, in lots of ways, talking about class, um, and yeah.
00:55:09 Speaker 2
So partly the unpolished academic is, I don’t know what, but.
00:55:12 Speaker 2
You.
00:55:13 Speaker 2
Is it a forum?
00:55:14 Speaker 2
Is it a movement?
00:55:15 Speaker 2
I want us to just keep talking about it and performing, modeling, the making of the sausage.
00:55:19 Speaker 1
Yeah.
00:55:20 Speaker 2
What do you want this podcast to be?
00:55:21 Speaker 1
Yeah, exactly.
00:55:22 Speaker 2
But maybe it’s just this podcast and I can just support your podcast.
00:55:23 Speaker 1
But what I also yeah.
00:55:25 Speaker 2
But you’re not going to rename it the unpolished academic, but.
00:55:27 Speaker 1
No, but that is.
00:55:29 Speaker 1
I’m like I I put in the notes or whatever here.
00:55:31 Speaker 1
I’m like that could be written on my tombstone.
00:55:33 Speaker 1
I feel like and yet.
00:55:34 Speaker 1
What is interesting is I’m the child of two academics like I am the child of people who were like, you know, who were professors like.
00:55:42 Speaker 1
So I my mom was a grad student when I was in elementary school, and my dad was already a professor like.
00:55:47 Speaker 1
And yet I still, like... and I can see there's certain things, and it always, like, makes me cringe a bit when I have, like, a first-generation grad student, and I've had committee members say things like, well, they just don't have the gravitas.
00:56:03 Speaker 1
And it’s like it.
00:56:03 Speaker 2
Yeah.
00:56:04 Speaker 1
It makes me so enraged.
00:56:05 Speaker 1
And I’m like how the like, how do you have the gravity like I have the Grove?
00:56:08 Speaker 1
Has because that’s my habit is to borrow from Bourne Holiday.
00:56:12 Speaker 1
It’s like I was born into it.
00:56:14 Speaker 1
I didn’t work for that.
00:56:15 Speaker 1
I didn’t do anything for that like I just it happens to be I, you know, by whatever role of the dice with the sperm and the egg.
00:56:17 Speaker 2
Yeah.
00:56:24 Speaker 1
I came to be the.
00:56:25 Speaker 1
But like, it isn’t like it’s such a like, it’s so classist and it is so.
00:56:31 Speaker 1
But the other thing I just keep thinking about it like going back to your book and this idea of who is the expert.
00:56:36 Speaker 1
It’s like the farm.
00:56:37 Speaker 1
Others are gonna save, like these organic farmers.
00:56:40 Speaker 1
From what you’re saying, it’s like they’re gonna save us from, like, environmental destruction in some of, you know, not to like overinflate it or whatever.
00:56:44 Speaker 2
That’s OK.
00:56:46 Speaker 1
But they have a kind of expertise and it’s like and.
00:56:48
Absolutely.
00:56:50 Speaker 1
And this is where it’s like for me.
00:56:51 Speaker 1
Humility is not about even being humble, like it’s not about like you know.
00:56:57 Speaker 1
In a Christian way, like hiding your light, which I
00:57:00 Speaker 1
know from the old musical Godspell. It's not about, like, hiding what we're experts at.
00:57:00 Speaker 2
Yeah, yeah, yeah.
00:57:07 Speaker 2
No, not at all.
00:57:07 Speaker 1
It’s it’s about like owning our expertise.
00:57:10 Speaker 1
It’s like I am an expert at whatever.
00:57:12 Speaker 1
Yes.
00:57:13 Speaker 1
And I I know stuff, but I don’t know everything.
00:57:16 Speaker 1
Yeah.
00:57:16 Speaker 1
And I I have a sociological lens.
00:57:18 Speaker 1
And so I I personally like I was an expert guest on a thing about AI, and I was like, I’m not an.
00:57:20 Speaker 2
Absolutely.
00:57:25 Speaker 1
Like I don’t know anything, but I have so many opinions and it could be so tempting cause we all have thoughts about these this.
00:57:30 Speaker 1
Years, but good research, good being a scholar to me is about well or my friend Maggie.
00:57:30 Speaker 2
Yeah.
00:57:37 Speaker 1
She was talking about, like, conspiracy theorists.
00:57:39 Speaker 1
It’s like this desire to, like, think that you’re really smart.
00:57:42 Speaker 1
And so you create this very convoluted kind of a theory to like explain things and it’s so tempting because it makes you feel like, oh, I have this information that nobody else has.
00:57:52 Speaker 1
I’m smart.
00:57:52 Speaker 1
See how smart I am.
00:57:53 Speaker 1
And it’s like, but what you’re not doing.
00:57:56 Speaker 1
Is your not getting challenged?
00:57:58 Speaker 1
It’s like you’re not questioning yourself.
00:58:00 Speaker 1
You’re not, and so it’s like there’s this balance that we have to engage in of, like, not like, if you don’t speak up about this, if you’re like, oh, I’m not an expert and like, I’m not kind.
00:58:10 Speaker 1
I’m just a girl.
00:58:11 Speaker 1
Like, we’re not gonna get anywhere, but if we’re also like, oh, yeah, I know everything about everything.
00:58:16 Speaker 1
We're also gonna go in a really scary direction, you know.
00:58:20 Speaker 2
Yeah, that’s.
00:58:20 Speaker 1
So it’s like this, not rigorous.
00:58:22 Speaker 2
No, no, it’s true.
00:58:24 Speaker 2
It is a balance.
00:58:25 Speaker 2
I mean, it’s like playing within the system.
00:58:27 Speaker 2
And also, yeah, I’m not against claiming expertise or being confident in ones assertions in ones knowledge gained through hard work and systematic thinking and intelligence.
00:58:38
It’s just a.
00:58:42 Speaker 2
But yeah, I mean, I find it difficult, the demands on academics to perform.
00:58:42 Speaker 1
Yeah.
00:58:51 Speaker 2
It’s not that there is an expertise, but that, like objectivity, authority, expertise.
00:58:56 Speaker 2
Look, I mean, we’re kind of told in these very subtle ways, right?
00:59:00 Speaker 2
You said habitus before, but that in order to be read as authoritative or expert, one needs to perform.
00:59:06 Speaker 2
I mean, I really think it’s white masculinity, right?
00:59:09 Speaker 2
Yeah.
00:59:10 Speaker 2
So I think there are different ways to be.
00:59:13 Speaker 2
Yeah, yeah, less polished.
00:59:16 Speaker 2
More honest, maybe more open, right, to deliberation and other points of view, to challenge, and yeah.
00:59:28 Speaker 2
Yeah, I don’t know.
00:59:29 Speaker 2
There’s a kind of more.
00:59:31 Speaker 2
Yeah.
00:59:32 Speaker 2
So.
00:59:32 Speaker 1
Yeah, yeah.
00:59:32 Speaker 2
So that’s the unpolished academic and.
00:59:33 Speaker 1
I mean, yeah, I love it.
00:59:35 Speaker 1
And that’s and again it’s like like just to I already said it, but I’m just gonna say it again.
00:59:39 Speaker 1
It’s like going back to the farmers.
00:59:40 Speaker 1
It’s like if we don’t do this, it’s not just like, oh, it’s nice.
00:59:43 Speaker 1
It’s not just like we are allowed to live more authentic lives as individuals.
00:59:47 Speaker 1
It’s also that if we all can be like to me, humility is honesty.
00:59:51 Speaker 1
Like that’s it.
00:59:52 Speaker 1
It’s just.
00:59:52 Speaker 1
And honest. It allows us to then, like... we can tap into knowledge that we would otherwise be too afraid to tap into.
01:00:03 Speaker 1
Like to me this is like it you know.
01:00:03
Yeah.
01:00:05 Speaker 2
Yeah, it’s rigger.
01:00:06 Speaker 1
Yes. Yeah.
01:00:06 Speaker 2
Yeah, in a way.
01:00:07 Speaker 2
So, I mean, in a way, that's sort of how I started the journey on AI in agriculture: it was through first looking at GMOs, genetically modified organisms.
01:00:17 Speaker 2
And I was really interested in these legal disputes between farmers
01:00:20 Speaker 2
and these companies, and I became interested, for my master's, in
01:00:25 Speaker 2
Why these groups of farmers without any resources were going to bat against these large, powerful like suited up, you know, agribusinesses like Monsanto.
01:00:35 Speaker 2
Why? And I made sense of it at the time, in my master's in sociology, as a kind of social activism, right?
01:00:42 Speaker 2
There was like nowhere but the courts to gain visibility to the kinds of issues that they were concerned about.
01:00:47 Speaker 2
Again, the non-technical issues. They were almost just using these lawsuits as a spectacle.
01:00:47 Speaker 1
Interesting.
01:00:54 Speaker 1
Interesting, yeah.
01:00:54 Speaker 2
Um, yeah.
01:00:55 Speaker 2
And then for my PhD, I continued to focus on these lawsuits.
01:01:01 Speaker 2
Partly because the lawsuits continued all the way up through to the Supreme Court level. Like, this was not going away.
01:01:08 Speaker 2
One of them in particular was highly visible.
01:01:10 Speaker 2
The Percy Schmeiser case.
01:01:11 Speaker 2
In fact, I think there was a movie made out of it.
01:01:13 Speaker 2
Christopher Walken played Percy Schmeiser the farmer, and it was Monsanto versus Schmeiser.
01:01:19 Speaker 2
And so the lawsuits sort of continued and I followed them, but it was also that I became really interested in this lack of humility, or this surety, the way that the legal actors were defining expertise and risk, right.
01:01:33 Speaker 2
Because, spending time with the farmers, I knew the kinds of facts that they were submitting to the court. I came to know the farmers through the ethnographic study I'd done for my master's, and I knew, as you said, that they have a kind of expertise that I would call scientific, right, experiential knowledge of the land.
01:01:52 Speaker 2
They intervene, right? That's science:
01:01:54 Speaker 2
it's systematic intervention into the natural
01:01:56 Speaker 2
world. And there, in the case of the Schmeiser trial,
01:01:59 Speaker 2
So interesting to me.
01:02:00 Speaker 2
You know, Schmeiser had his neighbor testify that this GM canola actually flew off the back of a truck and landed on Schmeiser's property.
01:02:08 Speaker 2
Schmeiser had detailed photographic evidence that these GM seeds were like growing on the perimeter of his property.
01:02:15 Speaker 2
And that’s how he explained the so-called.
01:02:18 Speaker 2
Contamination or the the the company said he was growing it illegally, that he hadn’t paid them and he said no, no, no.
01:02:22
Ohhhhh.
01:02:25 Speaker 2
You’ve polluted my conventional seed, and so it was this, like, basically battle over that right?
01:02:27
No.
01:02:31 Speaker 1
Wow.
01:02:31 Speaker 2
And I think there were.
01:02:32 Speaker 2
I ended up knowing he’s actually a colleague at the University of Ottawa now affect reader on that case.
01:02:37 Speaker 2
There were problems with the facts, or the things he, Schmeiser, submitted to the court.
01:02:42 Speaker 2
But what I found so interesting was the legal actors, the way they were defining expertise and drawing boundaries.
01:02:48 Speaker 2
So, for example, that testimony from Schmeiser's neighbor. Boyster Meyer
01:02:52 Speaker 2
is his name.
01:02:53 Speaker 2
They basically dismissed that.
01:02:54 Speaker 2
And yet Monsanto hired this aeronautical engineer to talk about the average wind speed at which a seed would fall.
01:03:02 Speaker 2
And they called that expertise.
01:03:05 Speaker 2
Like, the language that the legal reasoner used was "expert," versus Boyer's testimony: "opinion."
01:03:11 Speaker 1
Wow.
01:03:12 Speaker 2
Which... those kinds of semantic, you know, dichotomies: expert versus opinion.
01:03:18 Speaker 2
That’s how I became interested in language and, well, the kind of thinking carefully about or taking a humble position on what we know, right, the limitations of our knowledge.
01:03:28 Speaker 2
And how that then plays into who can participate in defining our future, the future of food. And that's really how I then, you know, made my way into data and AI, because I started to follow...
01:03:32 Speaker 1
Yeah.
01:03:40 Speaker 2
I started to think about power and language and these agribusinesses.
01:03:45 Speaker 2
Um, and I started to follow their purchasing habits and their investments.
01:03:52 Speaker 2
Like, I was looking at, you know, financial documents for these big firms and what they were doing with research and development around these seeds, because these are patented objects, these seed systems. And in, like, 2015, it became clear to me:
01:04:05 Speaker 2
Their seed patents were coming up on the major technologies, their Roundup technologies for Monsanto, in 2022.
01:04:12 Speaker 2
But it didn’t seem to me the company was investing in in future genetic research.
01:04:16 Speaker 2
To the extent that it was investing in analytics and data companies, they were buying up exactly.
01:04:20
Oh, why?
01:04:22 Speaker 2
So it started out as just this curiosity.
01:04:24 Speaker 2
Like, wait a second is Monsanto an analytics company?
01:04:28 Speaker 2
And like, where do they get their data from and what are they doing with data?
01:04:31 Speaker 2
And then it was, you know, this the public conversation around data.
01:04:34 Speaker 2
And I was thinking, nobody’s paying attention.
01:04:36 Speaker 2
to companies like Monsanto, to Bayer,
01:04:40 Speaker 2
To John Deere.
01:04:41 Speaker 2
Yeah, as data companies. And who... where are they getting data from?
01:04:43 Speaker 1
Yeah.
01:04:46 Speaker 2
What are?
01:04:47 Speaker 2
What are they doing with big data sets?
01:04:47 Speaker 1
We gotta you also said there, but OK.
01:04:50 Speaker 2
Yeah.
01:04:50 Speaker 2
And, like, how do I find out how these data are potentially being monetized?
01:04:56 Speaker 2
They must be, I figured, part of
01:04:57 Speaker 2
a future revenue stream for the companies,
01:04:59 Speaker 2
if there was such investment
01:05:00 Speaker 2
being made in data. And so that's how the book started.
01:05:05 Speaker 1
So what did you find like so?
01:05:07 Speaker 2
Well, I mean, the first thing I found was that it was really difficult to find anything, right? Because the data sets... it's basically John Deere tractors, and other tractors too.
01:05:08
No.
01:05:18 Speaker 2
But John Deere holds the majority of the market, at least in North America, and also countries like Brazil.
01:05:25 Speaker 2
And because they hold the market... they just dominate the market
01:05:29 Speaker 2
for machinery. Now, every John Deere tractor is basically a data-collecting tractor.
01:05:36 Speaker 2
It’s a, it’s licensed, actually it’s not owned.
01:05:40 Speaker 2
It’s like a cell phone and it has embeds hundreds of sensors that collect data passively from farms, and these data are sent to cloud based infrastructure the company.
01:05:47 Speaker 2
Makes the data proprietary and actually we know that they’ve legally transferred data they do to input supply companies like Monsanto and but so we I knew this but like following the data getting access to data sets the farmers can’t get access.
01:06:01 Speaker 2
I couldn’t, as a critical researcher, gain access to the data sets.
01:06:04 Speaker 1
Yeah.
01:06:05 Speaker 2
I tried to do interviews, but I had published enough
01:06:08 Speaker 2
on power and GMOs that, you know, people within the businesses didn't want to speak to me.
01:06:14 Speaker 2
So I mostly got the stakeholder relations person, who gave me, like, boilerplate, which was actually really fascinating, because I came to see similar messaging across the companies from stakeholder relations, which is a finding in and of itself.
01:06:23 Speaker 1
Right, yeah.
01:06:26 Speaker 2
Um, but then often I would get a retired scientist who would say, OK, I can speak to you because I'm no longer subject to a nondisclosure.
01:06:34 Speaker 2
So I started to see, actually, that it was not just copyright law, intellectual property law, that was preventing me from following the data and figuring out who
01:06:43 Speaker 2
is, you know, gaining economically or otherwise from the data, but also trade secrecy law.
01:06:50
Yeah.
01:06:50 Speaker 1
Wow.
01:06:51 Speaker 2
Yeah.
01:06:52 Speaker 2
So it was really hard to tell, which is a finding in and of itself.
01:06:54 Speaker 2
It was like, oh, wait a second: at least in terms of my interaction with regulatory actors, we need to be thinking about transparency and openness, right? I mean, for the basic reason that these companies are collecting the data.
01:07:07 Speaker 2
And then companies like Monsanto run the farm data through proprietary algorithms.
01:07:11 Speaker 2
And then sell advice to farmers.
01:07:13 Speaker 2
Now it’s called precision agriculture about how best to farm.
01:07:18 Speaker 2
If we can’t validate the algorithm, but how do we know it’s actually doing what it says it’s going to do?
01:07:21 Speaker 1
Right.
01:07:23 Speaker 2
And there’s a clear vested interest if you look at I have an article from big data in society that tries to look at what the algorithms do right.
01:07:33 Speaker 2
And if you look, they only recommend chemicals, for example, within the same ecosystem of products from the same company. Like... so, from a basic public-good perspective, we need transparency, or at least we need access for critical researchers, or those who might validate what these algorithms say they're doing for farmers.
01:07:38 Speaker 1
Wow, I get it now.
01:07:48
Ohhhh.
01:07:52 Speaker 1
Oh my God.
01:07:53 Speaker 2
But also from like a broader kind of social justice perspective.
01:07:55 Speaker 1
Oh, my God.
01:07:56 Speaker 2
So so.
01:07:56 Speaker 2
I mean, I couldn’t.
01:07:57 Speaker 2
I couldn’t figure it out, but then the story I mean, the reason the book is called the Immaculate Conception of data as opposed to like power.
01:08:04 Speaker 2
I think my book proposal was, like, Big Data, Big Power and Ag, or something. And I saw that, like, there were ways to infer that these companies are doing things with data that reinforce their already completely inequitable market power, which then leads to lobbying and helps them define all of agriculture, not just for farmers but also consumers and the environment.
01:08:07
Yeah.
01:08:27 Speaker 2
As you said before, the non-human environment. But I started to notice in my data set
01:08:35 Speaker 2
something else,
01:08:36 Speaker 2
something bigger, I think, even than the whole, like, monetary power via data, which was that everyone, even, I found, critical activists, those of us who were in this space in 2016, '17, and just kind of everywhere, every convention I went
01:08:55 Speaker 2
to, there was a way of talking
01:08:57 Speaker 2
about data
01:08:58 Speaker 2
and AI
01:09:00 Speaker 2
that was shared, and that was immaculate.
01:09:04 Speaker 2
It was, like, basically, you know, using phraseology like "data-driven" and "the AI system knows" and "AI is smart" and "feeding," and all of that, which you might just think, oh, it's just metaphors.
01:09:15 Speaker 2
Oh, it’s just language, but I started to see that it was a kind of useful Speaking of rhetoric, rhetorical tool for people who are trying to win.
01:09:24 Speaker 2
And right resources, whether it was, you know, government grants or getting farmers to buy things, or convincing a social scientists that this is the future of agriculture.
01:09:37 Speaker 2
So there was this kind of futurism, like, AI is the future, and the AI tool itself is driving the bus.
01:09:43 Speaker 2
It’s going to get us there inexorably, and it was that way of talking, as if AI or data are immaculately conceived and not a product of human intervention that I started to see as the real finding in the book.
01:09:57 Speaker 2
And obviously then it it’s the framework that gives the book its title and the real problem.
01:10:02 Speaker 2
coming back to what I see as the main point of doing social research, because I started to see that it's a framework that exceeds agriculture.
01:10:09 Speaker 2
It’s everywhere.
01:10:10 Speaker 2
Yeah, people talk about data that way everywhere.
01:10:10 Speaker 1
Oh, totes, yes.
01:10:13 Speaker 2
And, you know, I started to think, well, what happens?
01:10:15 Speaker 2
What are the consequences for the actual unsavory things?
01:10:19 Speaker 2
Or, you know, things that are being done with data.
01:10:23 Speaker 2
Well, if we talk about data as immaculately conceived, then we're not really alive to, right...
01:10:30 Speaker 2
we're not as alert to, sorry, the
01:10:32 Speaker 2
power and the politics and the actual decisions being made
01:10:37 Speaker 1
Yeah.
01:10:37 Speaker 2
By whom?
01:10:38 Speaker 2
Right.
01:10:38 Speaker 2
We can’t ask those precise questions about who collected these data.
01:10:42 Speaker 2
Yeah.
01:10:42 Speaker 2
Who’s monetizing them?
01:10:44 Speaker 1
No.
01:10:44 Speaker 2
Who’s made powerful or disempowered by the use of these data?
01:10:47 Speaker 2
Or this algorithm and it’s.
01:10:49 Speaker 1
Oh, totally no.
01:10:49 Speaker 2
So there was a real politics...
01:10:50 Speaker 2
there's a politics to that framework.
01:10:52 Speaker 1
And it’s.
01:10:54 Speaker 1
Yeah.
01:10:54 Speaker 1
And it’s ohk.
01:10:55 Speaker 1
My God, there’s like so many thoughts I have and like it.
01:10:58 Speaker 1
Also, I love Stewart Hall and Stewart Halls, like theories of like, you know that it.
01:11:00
Yeah, totally.
01:11:03 Speaker 1
is... the most powerful way to sort of control any kind of narrative or belief is when you start having people, like experts, start to take this in, and it is, like, this framework of, like, repeating it and believing it. And it's not like... you don't need Monsanto to be making this argument when you have 100,000 other people; it becomes common sense, like, you know, that this thing. And also, like... I was thinking about, well, one small thing: I was chatting about this with my husband this morning after we finally
01:11:28 Speaker 2
Yeah, absolutely.
01:11:37 Speaker 1
Got our kid to school only 5 minutes.
01:11:39 Speaker 1
Really.
01:11:39 Speaker 1
And I was trying to say, like, OK, this was my... oh my God, it was amazing.
01:11:41 Speaker 2
It’s good.
01:11:43 Speaker 1
It was like... but one of the strategies that I used was to use a timer. And it was, like, because of this idea, I think, where I was like, OK, I'm setting a timer for 15 minutes, and then in 15 minutes it's going to be time to go. And they were like, oh, OK. Like, it took more than that.
01:11:57 Speaker 1
Like it wasn’t that magical, but I was just saying it’s like, so funny that there’s this kind of, like, objectivity in the data of the like clock.
01:12:04 Speaker 1
You know, in this clock where it’s like.
01:12:05 Speaker 1
You know, I was talking about like sometimes students will be like, well, the PowerPoint slide said whatever.
01:12:10 Speaker 1
And I’m like, yeah, who the **** do you think made that PowerPoint slide like that is me.
01:12:13 Speaker 2
I.
01:12:14 Speaker 1
I made that. Like, that...
01:12:15 Speaker 1
the PowerPoint is me. Like, there's this sort of, like... a really clear rubric:
01:12:18 Speaker 1
Like, I often will make really clear rubrics for my students, and it's like, I'm the one...
01:12:22 Speaker 1
you'd get the same grade as if I just gave you a
01:12:24 Speaker 1
paragraph
01:12:25 Speaker 1
of strengths. But it's like, there's this idea of, like, well, they can calculate the numbers.
01:12:29 Speaker 1
And this was this percent and that, and it's like, don't you realize that I actually am, like, futzing with it to make it be the B that I was going to give you?
01:12:35 Speaker 2
Absolutely.
01:12:36 Speaker 1
No matter what.
01:12:36 Speaker 2
So that is so interesting.
01:12:37 Speaker 1
You know, but it’s it.
01:12:38 Speaker 2
Absolutely.
01:12:39 Speaker 2
Like absolutely.
01:12:40 Speaker 1
It’s like this idea of objectivity through these kinds of forms.
01:12:43 Speaker 1
Of data.
01:12:44 Speaker 1
That just aren’t real. Yeah.
01:12:46 Speaker 2
So that’s exactly it.
01:12:48 Speaker 2
So in the book, in Chapter 5 in particular, I draw on kind of a history of scholarship on, we might call it, the politics of numbers, or particular understandings of objectivity that then map onto, right, like, positivistic frameworks and numerics.
01:12:55 Speaker 1
OK.
01:13:06 Speaker 2
And to try to make sense of the omnipresence, as you say, of this framework, you know...
01:13:13 Speaker 2
It’s not just greenwashing or a particular message coming from Monsanto, but that’s in the book I ethnographically trace it from oops corporate ads all the way to the farm right when farmers are saying I’m doing this because this is the future of farming, and it was so interesting.
01:13:15 Speaker 1
No.
01:13:25 Speaker 1
Yeah.
01:13:26 Speaker 2
Farmers would say to me like I would say, well, what have you learned from this field mapping and this AI?
01:13:32 Speaker 2
Oh, well, nothing. Like, I knew that... I knew that part of my field was productive.
01:13:37 Speaker 2
I.
01:13:39 Speaker 2
Yeah, but now I know.
01:13:42 Speaker 2
Yeah, they would say, I know it now. Or, you know... but then there was also a way that, like, the sort of systems, for example around crop insurance, were being designed with this Immaculate Conception of data,
01:13:44 Speaker 2
Yeah.
01:13:53
I don’t know.
01:13:54 Speaker 2
this presupposition. So farmers, right, would submit all sorts of qualitative data, or Excel spreadsheets, and...
01:14:00 Speaker 2
There’s this one Canadian government egg impact climate reporter.
01:14:04 Speaker 2
And increasingly, farmers had to go through this algorithmic way of knowing right in order to access things like particular insurance provisions and.
01:14:13 Speaker 2
And so yeah, it’s about the omnipresence of that framework and the politics of it.
01:14:17 Speaker 2
So yeah, I tried to draw on, you know, other people in data studies and in a way, the book is very much in conversation.
01:14:23 Speaker 2
With really well known folks like Kate Crawford and Ian Bogost, who had talked about a kind of religiosity toward algorithms.
01:14:31 Speaker 1
Hmm.
01:14:32 Speaker 2
Um, but I found that work very descriptive.
01:14:37 Speaker 2
Like, just sort of pointing out, you know, oh, there's this religious view people have about AI, and describing it as a kind of epistemic distortion, almost like, you know, people are not looking at things correctly.
01:14:51 Speaker 2
And instead I draw on history of science studies scholarship, like Ted Porter's Trust in Numbers, which is exactly what you describe about your students: the, you know, numeric calculation, and, right, this kind of particular interpretation of objectivity coming from numbers per se.
01:15:08 Speaker 2
Yeah, I draw on that scholarship to try to explain, or analyze, why... like, why the omnipresence of this framework, and in order to, like, try to show, through a lot of stories,
01:15:13 Speaker 1
Yeah, I don’t. Yeah.
01:15:21 Speaker 2
And again, that kind of ethnographic intervention.
01:15:23 Speaker 2
And how this way of talking about data and algorithms is really useful.
01:15:30 Speaker 2
It’s rhetoric.
01:15:30 Speaker 2
It’s really powerful, right?
01:15:32 Speaker 2
Like, if you can do this... I could trace it right then.
01:15:36 Speaker 2
Oh, look at this.
01:15:36 Speaker 2
Supercluster got funded, right?
01:15:37 Speaker 1
Yeah.
01:15:38 Speaker 2
And this was their...
01:15:38 Speaker 2
they showed literally this completely, like, non-empirical, as in non-evidence-
01:15:45 Speaker 2
based, graphic in this bid for major government funding: from the beginning of agriculture, an asymptotic line to Ag 4.0. And that was the history of AI and agriculture.
01:15:58 Speaker 1
Yeah.
01:15:58 Speaker 2
Like what does that mean even?
01:16:00 Speaker 1
Yeah.
01:16:00 Speaker 2
Yeah, but if you could make this claim that, you know, we're going toward this perfect future...
01:16:06 Speaker 2
Yeah, with no environmental impact.
01:16:09 Speaker 1
Yeah.
01:16:09 Speaker 2
No, we’re all farmers.
01:16:11 Speaker 2
Make all the money and consumers are happy and there’s no food waste and there’s traceability and so therefore there’s, like we don’t, there’s no disease and.
01:16:19 Speaker 2
Isn’t this good?
01:16:20 Speaker 2
The Shangrila future and it’s data and AI that are going to get us there if you can, if you can, it’s back to you.
01:16:21
Yes.
01:16:24 Speaker 1
Yeah.
01:16:27 Speaker 2
I’m.
01:16:28 Speaker 2
What did you say?
01:16:29 Speaker 2
The beginning dictate no totalitarians right back to right.
01:16:30 Speaker 1
Oh, totalitarian.
01:16:33 Speaker 1
Yeah, yeah, yeah, totally.
01:16:34 Speaker 2
It’s a.
01:16:34 Speaker 2
It’s a great messaging strategy and so that’s what I I tried to explain in the book.
01:16:37 Speaker 1
And it works and I’ll just say because I like to pretend like I’m an expert on everything.
01:16:41 Speaker 1
And so because.
01:16:42 Speaker 2
You don’t. You’re no, you’re bottling humility.
01:16:43 Speaker 2
No.
01:16:44 Speaker 1
But I do... I try. But right before the summer before eighth grade, my family moved from Connecticut to Illinois, and I lived in, like, west-central Illinois, which is a very rural farming commu...
01:16:54 Speaker 2
Farming.
01:16:56 Speaker 1
...nity. And, like, we would have to drive an hour and a half to get to an airport. And if you actually go to the Quad Cities...
01:17:03 Speaker 1
So I think it’s Moline is technically, there’s like 4 cities.
01:17:07 Speaker 1
That’s right.
01:17:09 Speaker 1
Along the Mississippi River between Iowa and Illinois.
01:17:12 Speaker 1
And I did my grad degree in Iowa.
01:17:14 Speaker 1
Also a very farming place. And if you go to the airport, it's, like, all John Deere. Like... because I would fly with, like, my kids, and so there'd be, like, play things that were, like, these foamy things.
01:17:19 Speaker 2
Play.
01:17:24 Speaker 1
And at that Coralville mall, we take our kids there.
01:17:26 Speaker 1
And it was, like, corn and tractors. Like... this is, like, very rural commu...
01:17:29 Speaker 1
...nities. And I remember having this conversation.
01:17:33 Speaker 1
This is a long time ago, but the husband of one of my mom’s best friends is a farmer.
01:17:38 Speaker 1
His name is Ned, and he’s a lovely person and his like has three daughters who are brilliant and all went on to do brilliant things.
01:17:39
OK.
01:17:45 Speaker 1
Um and but he’s, you know, one of the few.
01:17:47 Speaker 1
Like she’s got a big firm and it’s corn and soy like, this is everything there.
01:17:52 Speaker 1
And I remember saying.
01:17:53 Speaker 1
Once, in, like, the early 2000s, I was like, you know, but isn't Monsanto bad?
01:17:58 Speaker 1
And he was really defensive, and, like... like, oh, what, like, are you gonna, like... the commons?
01:18:05 Speaker 1
You're gonna, like, go to...
01:18:06 Speaker 1
go to, go to somewhere... his daughter had been in the Peace Corps,
01:18:09 Speaker 1
I think he’s like, you know, whichever country was never.
01:18:11 Speaker 1
He’s like, go there and see how the Commons is doing for you.
01:18:14 Speaker 1
And he was like, you know... like, it costs something.
01:18:17 Speaker 2
The commons, as in communism?
01:18:18 Speaker 1
Like it’s like, yeah, like as cause he was like basically saying, like, you’re and he’s a progressive, like he’s not conservative. Like, this isn’t like he’s a very intelligent.
01:18:23 Speaker 2
Yeah, yeah, yeah.
01:18:28 Speaker 1
like, thoughtful person, but very, you know... he's a farmer, like, from generations of farmers. And one of his daughters actually went, like... she's a farmer.
01:18:36 Speaker 1
She went to Cornell and she does organic farming now, so different from her father.
01:18:40 Speaker 1
But, like... and they had a baby named Buck, which I think is the coolest name ever, right?
01:18:44 Speaker 2
Adorable.
01:18:44 Speaker 2
So was Ned.
01:18:45 Speaker 1
I know, I think.
01:18:45 Speaker 2
Have you told me about Ned?
01:18:47 Speaker 1
I think we talked about him once before.
01:18:47 Speaker 2
I don’t talk.
01:18:49 Speaker 1
Yeah.
01:18:49 Speaker 1
Yeah, Ned the farmer.
01:18:50 Speaker 1
I’m like, here’s my one farmer story anyway, but I still found it really interesting and especially how you talk and also I follow on TikTok or somewhere there’s this.
01:18:58 Speaker 1
Farmer, who I really enjoy.
01:19:00 Speaker 1
He’s a cow farmer, I think in Iowa, but his whole thing is like trying to show how his cows are treated, how and and exactly but in a very like, look how nice they are.
01:19:08 Speaker 2
How the sausage is made.
01:19:12 Speaker 1
But so much of it is look at the technology.
01:19:15 Speaker 1
Look at how... so Ned’s point was, I have a better crop yield because of Monsanto products, and I’m gonna pay for that.
01:19:15 Speaker 2
Yeah.
01:19:22 Speaker 1
Like I’m not gonna use like old seeds that like have more disease.
01:19:25 Speaker 1
He’s like, you know... like, it was very pragmatic.
01:19:28 Speaker 1
And I was, like, shut down, and I didn’t...
01:19:30 Speaker 1
I was like, not enough of an expert at anything, and I was like, OK, and then, I’m gonna go get a drink. But then...
01:19:36 Speaker 1
But this other guy that I follow on TikTok.
01:19:38 Speaker 1
It’s like it’s exactly what you’re describing.
01:19:41 Speaker 1
Is this like an?
01:19:42 Speaker 1
Again, it’s not that farmers are ignorant, it’s that they’re actually smart in a certain sense of like, I mean, they are smart.
01:19:48
Oh, yeah.
01:19:50 Speaker 2
Super rational actor.
01:19:52 Speaker 2
Not to laugh.
01:19:52 Speaker 1
Rational choice theory.
01:19:53 Speaker 1
That is the theory.
01:19:54 Speaker 1
And.
01:19:55 Speaker 1
I remember rational choice theory from before, so that... but, like, everything that you’re saying.
01:19:55 Speaker 2
Yeah.
01:19:56 Speaker 2
OK, got it.
01:20:00 Speaker 1
It’s like, of course, like, absolutely you’re gonna do this, unless you have someone like you coming along and being, like, can we think about this for a second?
01:20:07 Speaker 2
I mean, it’s very complex.
01:20:09 Speaker 2
First of all, I would lead by saying all farmers work so hard.
01:20:13 Speaker 1
Yes.
01:20:13 Speaker 2
And like, even though I’m not myself connected to a farm, my grandma grew up on a farm in Saskatchewan.
01:20:19 Speaker 2
But I do see that all farmers, regardless of size and equipment.
01:20:24 Speaker 2
Super, super hard-
01:24:25 Speaker 2
working, thankless job.
01:20:27 Speaker 2
Thank goodness we still have people who are willing to grow food because most of them don’t make a livable wage, right?
01:20:33 Speaker 2
Most farms in Canada are supported by off-farm income, which, to me, is just... like, if you’re interested in injustice or the environment, you should talk about farming.
01:20:40 Speaker 1
Right.
01:20:41 Speaker 1
Oh my God, I have seen, have heard, looked at...
01:20:42 Speaker 1
Like, the census...
01:20:43 Speaker 1
Maybe it was in your book I read it.
01:20:45 Speaker 1
But I remember just seeing the statistics, might have been when I was teaching.
01:20:49 Speaker 1
But, like, the demographic data in the census, like, the percentage of the population in, like, 1900 that were farmers, that that was their occupation.
01:20:53 Speaker 2
The decline, too, in agriculture.
01:20:54 Speaker 2
Oh, in farming.
01:20:57 Speaker 1
And then if you look at, like, 1999, it was like 2% or something in the United States, of people saying their career is farmer, whereas it was like, I don’t know, 65, 70 percent.
01:21:01 Speaker 2
Yeah.
01:21:06 Speaker 1
It’s like less than one.
01:21:07 Speaker 1
Like that is major social transformation over, you know, and it’s.
01:21:11 Speaker 2
And the livelihoods in those rural communities, and yeah, it’s a...
01:21:15 Speaker 1
Right. And so also, like, the knowledge collection.
01:21:17 Speaker 1
It’s like you don’t have neighbors anymore.
01:21:18 Speaker 1
Of course, you’re gonna rely on like this company coming and be like, well, we collected data from around the world and we have calculated this algorithm and we can tell you here’s the.
01:21:28 Speaker 1
Best way to do it?
01:21:28 Speaker 2
Advice specific to your farm.
01:21:29 Speaker 1
Right.
01:21:30 Speaker 2
It’s like precision medicine. At least, that’s the message.
01:21:33 Speaker 1
Yeah.
01:21:33 Speaker 2
Again, no one has really validated.
01:21:37 Speaker 2
Whether this is a good tool in terms of productivity gain, or specifically the environmental impacts, cause that’s part of the message, that, you know, through these kinds of precise, AI-driven interventions, farmers will make more judicious use of things like chemicals and water.
01:21:55 Speaker 2
And heck, I’m not against that.
01:21:56 Speaker 2
That’s.
01:21:57 Speaker 2
Alright.
01:21:57 Speaker 2
So, like, I will leave it saying, hey, all farmers are hard working, and thank God for farmers, but also, I’m not against the technology.
01:22:06 Speaker 2
I’m just.
01:22:07 Speaker 2
Yeah, you know, wanting to look carefully in my Social Research at who controls the technologies.
01:22:16 Speaker 2
Right.
01:22:17 Speaker 2
And then ask precise questions, like, who stands to gain the most, right? And even if farmers gain a little bit, is that gain... can it be compared?
01:22:26 Speaker 2
It’s like, platform use versus, right, these sort of massive economic gains in the uses and reuses of data by the platform companies, right?
01:22:33 Speaker 1
Yes.
01:22:34 Speaker 2
If you think about the holdings of, say, Facebook, or Google, Apple, Amazon, right.
01:22:40 Speaker 2
And so it’s just about, it’s about thinking about power.
01:22:42 Speaker 1
Yeah, it’s like you were just asking, what
01:22:45 Speaker 1
are we doing?
01:22:45 Speaker 1
Yeah, just, like, it’s the...
01:22:47 Speaker 2
Yeah.
01:22:47 Speaker 1
the general question. It’s like, yeah.
01:22:48 Speaker 2
Who’s who’s really benefiting, and should we set some parameters?
01:22:52 Speaker 2
Like, should farmers pay for this advice? Or, because they effectively pay with their data,
01:22:57 Speaker 2
Should they get the advice for free?
01:22:58 Speaker 1
Yeah, no kidding.
01:22:59 Speaker 2
Or should we prevent these companies from being able to transfer data among themselves, or to sell the data sets to insurance or reinsurance companies that literally
01:23:09 Speaker 2
stand to profit off of farmer loss? Should we be able to use these data to predict areas of chemical need and set prices?
01:23:17 Speaker 2
Because these companies are so big in terms of their market share, they’re oligopolies.
01:23:23 Speaker 2
You know, all along the food chain in agriculture, there’s a small handful of companies. Like we said, everyone is a John Deere person, right?
01:23:29 Speaker 1
Yeah, yeah.
01:23:31 Speaker 2
And that’s no kidding.
01:23:32 Speaker 1
**** about.
01:23:33 Speaker 1
I don’t know the others.
01:23:33 Speaker 2
Yeah, think about it.
01:23:34 Speaker 1
I don’t even know what the other companies are.
01:23:35 Speaker 2
Yeah, but there are some small handful of machinery companies.
01:23:39 Speaker 2
Yeah, small handful of companies, you know, four companies now control the entire global market for seeds.
01:23:46 Speaker 1
Wow, that’s unbelievable.
01:23:47 Speaker 2
Unbelievable.
01:23:49 Speaker 2
So you know, what does that do to choice for farmers?
01:23:53 Speaker 2
Yeah, for consumers? What does that mean in terms of, like, how
01:23:58 Speaker 2
decisions are made in Ottawa or Washington?
01:23:58 Speaker 1
Yeah.
01:23:59 Speaker 1
Accountability.
01:24:00 Speaker 1
Like, totally it’s like.
01:24:01 Speaker 2
Yeah.
01:24:02 Speaker 2
Like, the lobbying. So it’s really just about power.
01:24:06 Speaker 2
I mean, it’s power all the way down for me, but totally.
01:24:09 Speaker 2
Yeah.
01:24:10 Speaker 2
And so for that farmer, for Ned, who’s, you know... that’s part of it too, the messaging, right?
01:24:13 Speaker 1
If.
01:24:13 Speaker 1
He’s retired now, but.
01:24:16 Speaker 2
And that’s one way that power operates, and it’s a way that centrally interests me.
01:24:21 Speaker 2
Obviously, I’ve made that claim,
01:24:22 Speaker 2
I think, really clear through this podcast, is the use of rhetoric and language and how power gets set through, right, messaging.
01:24:29 Speaker 1
Yeah.
01:24:31 Speaker 2
And so farmers are told in so many ways, including in corporate advertising, by their ag expert advisors, right, their service providers, government advice, that you gotta get big or go home.
01:24:45 Speaker 2
That was the famous language of Earl Butz, the Secretary of Agriculture in the US, who was really responsible for the beginning of, I would say, the takeoff of big corn and big soy in the US.
01:24:55 Speaker 2
And I have an amazing colleague, Jennifer Clapp, who’s a CRC Tier 1, who has looked at the history of basically political tools, like subsidies in the US, and how, you know, US food aid and US policies internationally and also domestically
01:25:08
Yeah.
01:25:13 Speaker 2
were used to basically set a particular vision for agriculture.
01:25:17 Speaker 2
We might call it industrial, the kind Ned does, right. Big corn, yeah.
01:25:20 Speaker 2
Big, big soy, in order to establish dominance for the US in the postwar period.
01:25:25 Speaker 1
Yeah.
01:25:25 Speaker 2
So it’s all wrapped up in empire building, basically. But that was the messaging, and has been the messaging, coming from corporations in a bazillion different ways for farmers.
01:25:28 Speaker 1
Oh my God, yes.
01:25:34 Speaker 2
You know, farming doesn’t, there’s not really money in primary production unless you’re one of the companies selling tractors and seeds.
01:25:41 Speaker 2
And farmers are told the only way to make money is to get bigger, to buy the newest technology, usually by taking on more debt.
01:25:49 Speaker 2
Right, right.
01:25:49 Speaker 2
And therefore you can make a bit more money.
01:25:52 Speaker 2
You can outcompete your neighbors, right?
01:25:53 Speaker 2
What is that?
01:25:54 Speaker 2
The law of diminishing returns, you just, you know, tiny little margins, um, and then you buy up the neighboring farms that go out of business.
01:26:02 Speaker 2
And but it’s.
01:26:03 Speaker 2
Yeah, it’s a tough gig.
01:26:04 Speaker 2
It’s not the way to make real money.
01:26:04 Speaker 1
It’s like monopoly, yeah.
01:26:05 Speaker 2
The way to make real money... there’s an interesting report that I analyze in the book on data in ag.
01:26:11 Speaker 2
It’s from Goldman Sachs, and they’re basically radically honest.
01:26:15 Speaker 2
I mean, it’s for investors.
01:26:16 Speaker 1
Yeah.
01:26:16 Speaker 2
Sounds
01:28:17 Speaker 2
like, of course, they’re honest, right?
01:26:18 Speaker 2
It’s not for critical scientists to look at.
01:26:21 Speaker 2
This was messaging for investors investing in so-called precision agriculture, and the Goldman Sachs report says something like, in a gold rush, sell shovels. Like, you gotta sell
01:26:31 Speaker 2
The thing that’s collecting the data or you sell the algorithm that’s giving the advice, but you’re not.
01:26:36 Speaker 2
You’re not the person shoveling the gold.
01:26:38 Speaker 1
Oh my God, that is so brilliant.
01:26:40 Speaker 1
That is so, although the evil part of me is like, how can I parlay that into my new brand, but?
01:26:49 Speaker 2
Well, I don’t know.
01:26:49 Speaker 1
But.
01:26:50 Speaker 2
Yeah, in your bid for becoming an influencer.
01:26:53 Speaker 2
I don’t know.
01:26:53 Speaker 2
Yeah, maybe we need to turn the tools against the powerful.
01:26:57 Speaker 1
Yeah.
01:26:57 Speaker 2
Is there a way to use the tools for?
01:27:00 Speaker 1
Yeah, yeah.
01:27:01 Speaker 1
I don’t know.
01:27:01 Speaker 1
I’ll have to think about it.
01:27:02 Speaker 1
We should probably end here. Is there anything else you wanna add?
01:27:06 Speaker 1
Just I could talk to you all day and I’m.
01:27:09 Speaker 1
I definitely have to get back to.
01:27:10 Speaker 1
I started reading your book and then got distracted by a hundred things, but I wanna assign it.
01:27:14 Speaker 1
Also, I just wanna say in my.
01:27:15 Speaker 1
like, grad quantitative courses, cause, like, I do... when I’m teaching statistics, I’m always trying to explain...
01:27:19 Speaker 2
Assign it in September.
01:27:20 Speaker 2
Yeah.
01:27:22 Speaker 1
It’s like everything is a story.
01:27:24 Speaker 1
I was like what you are trying to do with this.
01:27:25 Speaker 1
It’s like these data aren’t true.
01:27:27 Speaker 1
These data are they’re just a particular form of evidence that allows you to tell a story, and it’s like people don’t.
01:27:35 Speaker 1
And that, to me... because most people aren’t going to end up using statistics.
01:27:38 Speaker 1
Most people are.
01:27:39 Speaker 1
It’s like they’ll run a regression analysis in my class and they’re never going to do it again.
01:27:42 Speaker 1
There’ll be a few people who will, but I want people to understand like the limits of quantitative analysis, data analysis, more than anything else and it’s.
01:27:50 Speaker 2
You’re such a good teacher.
01:27:52 Speaker 2
That’s amazing.
01:27:52 Speaker 1
I try.
01:27:53 Speaker 2
I hadn’t thought critically, as in, like, analytically or carefully, about how the sausage is made in the sciences
01:27:58 Speaker 2
until I started my master’s in the sociology of technology, with students in the social sciences who had thought so much more about the
01:28:04 Speaker 2
Sciences than I had as a scientist.
01:28:06 Speaker 1
No, but it’s like, well, I think.
01:28:07 Speaker 2
Also, like so, it’s great you’re giving that to your students.
01:28:09 Speaker 1
Yeah, but it’s hard too, because of what’s valued, you know. It’s like, I got jobs because I teach stats.
01:28:16 Speaker 1
It’s like, nobody wants to teach it, but everybody values it. And so it’s like, I can teach it, and, you know... but it doesn’t...
01:28:23 Speaker 1
But I grow more and more disillusioned with it, because I’ll see, at times,
01:28:28 Speaker 1
there, like...
01:28:29 Speaker 1
You know, like my dissertation, which I don’t want anyone to ever look at.
01:28:32 Speaker 1
Which now people will look at, but like it’s like it was.
01:28:35 Speaker 1
So like I started with a proposal and then I got a grant and then I was invested in it.
01:28:40 Speaker 1
By the time I realized that there wasn’t really any story to tell with the data that I had.
01:28:45 Speaker 1
And so I had to make a story.
01:28:46 Speaker 2
Cool.
01:28:47 Speaker 2
Yeah, yeah.
01:28:47 Speaker 1
You know, it’s like I had to do it.
01:28:48 Speaker 1
And so it’s like, well... and, like, I didn’t do, like, horrible data mining.
01:28:52 Speaker 1
And I believe in ethical... like, I do believe that...
01:28:56 Speaker 1
It’s like I believe in letting the data tell the story while also recognizing there’s a person who collected these data.
01:29:02
Yeah.
01:29:02 Speaker 1
There’s... or, like, realizing...
01:29:03 Speaker 2
Or there’s more than one way to.
01:29:04 Speaker 2
Tell this story.
01:29:05 Speaker 2
Right.
01:29:05 Speaker 2
And there’s more than one way to collect the data on any particular variable, which is what you.
01:29:07 Speaker 1
Yes, yes.
01:29:09 Speaker 1
Or that the variables aren’t there?
01:29:11 Speaker 1
It’s like I tried.
01:29:11 Speaker 1
Like I like.
01:29:12 Speaker 1
So I got into qualitative research because nobody was asking the questions that I was asking because nobody thought like breastfeeding and work.
01:29:13 Speaker 2
Interesting, yeah.
01:29:20 Speaker 1
What does that have to do with anything?
01:29:21 Speaker 1
And it’s like any ******* mother who had a baby like you were talking before.
01:29:24 Speaker 1
It’s like pumping your milk on the train.
01:29:26 Speaker 1
Like, obviously there’s a huge connection between breastfeeding and work, but everything about breastfeeding was just like, oh, this is best.
01:29:32 Speaker 1
This is good for your baby. And I’m like, why is nobody thinking about what this is like for the person actually providing this, like, life-saving elixir of, like, world, you know, domination or whatever.
01:29:37
Yeah.
01:29:41
What?
01:29:43 Speaker 2
You know, it’s so interesting.
01:29:44 Speaker 2
Yeah, absolutely.
01:29:46 Speaker 2
I love that, yeah.
01:29:47 Speaker 2
So in the book, I’m quite careful, I think, to, you know, talk not about data bias, cause that stuff doesn’t concern me.
01:29:55 Speaker 1
Yeah. Yes.
01:29:55 Speaker 2
It’s partiality.
01:29:57 Speaker 2
It’s like, you know, on any given variable or curiosity about the world, someone decides.
01:30:03 Speaker 2
Well, first of all, what variables are of interest?
01:30:06 Speaker 2
And then and then how to collect those data?
01:30:09 Speaker 2
How to structure them?
01:30:10 Speaker 2
How to weight them?
01:30:11 Speaker 2
Then write the algorithm. So there’s human decision-making there at every step of the way.
01:30:15 Speaker 2
And then there’s how to tell the story from the data, or from the insight.
01:30:17 Speaker 1
Yeah, yeah.
01:30:20 Speaker 2
Right.
01:30:21 Speaker 2
And so it’s.
01:30:22 Speaker 2
Yeah, just drawing attention to those kinds of things.
01:30:24 Speaker 2
And you’re right.
01:30:24 Speaker 2
That’s what social scientists, I think, do, especially of the qualitative ilk. Maybe it’s thinking carefully about, also, fit for purpose.
01:30:32 Speaker 2
Like, one project that I’ve done with the Chief Scientist’s Office, with a grad student of mine, was designing a decision tool, really just, I mean, an Excel spreadsheet, that would allow bureaucrats to assess the credibility of evidence coming in for impact assessments.
01:30:52 Speaker 2
These are like large environmental assessments of big development projects, but on social.
01:30:52 Speaker 1
Oh my God.
01:30:56 Speaker 2
effects, because there’s this lack of knowledge about, well, first of all, that, like, data collected from a public roundtable are data.
01:31:04 Speaker 2
But also then, how do we, right...
01:31:06 Speaker 2
What method should we use to collect data on, say, perceptions of cultural artifacts that might be harmed by a development project?
01:31:12
OK.
01:31:13 Speaker 2
Could be qualitative. Wow.
01:31:13
Could.
01:31:15 Speaker 1
I love it. And I...
01:31:15 Speaker 2
Could I share that with you, and?
01:31:16 Speaker 1
You could share it on the website.
01:31:17 Speaker 2
It’s like all sorts of, you know, just thinking really carefully.
01:31:20 Speaker 2
About... in many cases, it ought to be qualitative methods that are used, and then, of course, there are ways to judge the rigor, or, I would say, the credibility, the validity, of data collected qualitatively.
01:31:33 Speaker 1
Yes.
01:31:33 Speaker 2
It’s not like it’s all opinion or nothing, right?
01:31:37 Speaker 2
You know, it’s... or,
01:31:38 Speaker 2
It’s all opinion, it’s all relative. But there’s not a lot of knowledge about
01:31:42 Speaker 2
how to make those assessments well.
01:31:45 Speaker 1
And it’s also, I love it.
01:31:46 Speaker 1
I was like, I wanna see it also for myself, cause there’s part of it where it’s like.
01:31:49 Speaker 1
I kind of intuitively know how to.
01:31:51 Speaker 1
But how do we do this like it’s?
01:31:51 Speaker 2
It’s like.
01:31:51 Speaker 2
You.
01:31:52 Speaker 2
And that was part of the fun.
01:31:53 Speaker 2
It was, like, trying to think through, exactly, systematically, how to assess...
01:31:57 Speaker 1
Yeah, yeah.
01:31:58 Speaker 1
How do I assess these claims?
01:31:58 Speaker 2
Rigorously.
01:32:00 Speaker 1
You know, it’s like, how do I know?
01:32:02 Speaker 1
Cause it’s all I’ll just say like with my kids, it’s like they sent me this one video of this guy that they really like a you tuber and they’re like he collects so much evidence.
01:32:09 Speaker 1
He has so much data, all the stuff and I’m like, yeah, but he’s making this causal claim that it’s it’s it.
01:32:15 Speaker 1
Like there could be 100 other causes.
01:32:17 Speaker 1
It’s like he’s saying well, like.
01:32:19 Speaker 1
Has event a happened?
01:32:21 Speaker 1
And yes, event B happened, but he’s saying B happened because of A and I’m like you can’t.
01:32:24 Speaker 2
I have.
01:32:26 Speaker 1
He doesn’t have, like, that’s the piece that he doesn’t have and it’s just I think those kinds of really it’s like why and like why make the podcast about like the social sciences in the humanities?
01:32:28
Yeah.
01:32:35 Speaker 1
Because I think it also is.
01:32:36 Speaker 1
It’s like the importance of like philosophy and you know, like media studies and rhetoric and all these different fields that I’m talking to people from of like.
01:32:46 Speaker 1
Like research is about thinking.
01:32:49 Speaker 1
It’s like it’s not just about like producing a report with a bunch of numbers on it.
01:32:53 Speaker 2
Yeah, you know.
01:32:54 Speaker 2
Yeah, evidence is comes with in a number of different ways.
01:32:58 Speaker 1
Yeah, yeah, I love it.
01:32:59 Speaker 1
I love it.
01:33:00 Speaker 1
Oh my God.
01:33:00 Speaker 1
And we got it and cause OK forever.
01:33:01
Like a time.
01:33:02 Speaker 1
Thank you, Phyllis.
01:33:02 Speaker 2
Thank you, Phyllis.
01:33:04 Speaker 2
I’ve then a boatload of fun and it’s no clearer so the sky.
01:33:07 Speaker 1
I know no.
01:33:08 Speaker 2
I mean oht.
01:33:08
Oh yeah.
01:33:09 Speaker 2
Our thinking maybe.
01:33:09 Speaker 1
Oh my.
01:33:10 Speaker 1
But.
01:33:10 Speaker 1
It’s white Shelly way worse.
01:33:12 Speaker 1
The window it’s.
01:33:13 Speaker 2
Like Penulis hasn’t turned around in this conversation.
01:33:14 Speaker 1
Sleep ohm.
01:33:15 Speaker 2
It’s a wall.
01:33:17 Speaker 2
My God, it just don’t know on it at home I’m walking.
01:33:18 Speaker 2
We’re gonna see.
01:33:19 Speaker 1
I know I walked oht good.
01:33:21
Me too.
01:33:21 Speaker 1
OK.
01:33:22 Speaker 1
So that’s that’ll be good.
01:33:23 Speaker 1
We’ll be able to see like and maybe it’s, I don’t know.
01:33:26 Speaker 1
I want some like metaphor of like sometimes it’s hard to see the big picture but all we can do is take one one step at a time.
01:33:32 Speaker 1
One day at.
01:33:32 Speaker 1
A time suggests, yeah, 1 foot at a time.
01:33:34 Speaker 1
Know we can only do it.
01:33:35 Speaker 1
We.
01:33:36 Speaker 1
And that’s right.
01:33:36 Speaker 1
That’s right.
01:33:37 Speaker 2
Alright, well, humility and taking one step at a time.
01:33:40 Speaker 1
Exactly.
01:33:40 Speaker 2
I’m gonna hold on to those.
01:33:41 Speaker 1
I love it.
01:33:41 Speaker 1
I.
01:33:42 Speaker 1
It.
01:33:42 Speaker 1
Oh my God, it’s so good thing.
01:33:43 Speaker 1
You so much.
01:33:44 Speaker 1
This is so I’m so glad that you came here and.
01:33:47 Speaker 1
And yeah, if people want to know more about your work, where should I send them?
01:33:51 Speaker 2
To guess my website.
01:33:51 Speaker 1
You everybody’s is.
01:33:52 Speaker 2
It’s woefully out of date, but yeah.
01:33:55 Speaker 1
That’s OK.
01:33:56 Speaker 1
Is it Kelly Bronson?
01:33:58 Speaker 1
What is your website?
01:33:59 Speaker 2
You know what?
01:33:59 Speaker 2
It’s the science and society collective.
01:34:03 Speaker 2
I made a distinct or a A and it on purpose proposal.
01:34:08 Speaker 2
What’s the word I’m looking for?
01:34:09 Speaker 2
Decision to not.
01:34:11 Speaker 2
That call it a lab.
01:34:13 Speaker 2
I don’t want to emulate the wet model, which is a whole other conversation.
01:34:16 Speaker 2
We could have.
01:34:16 Speaker 2
There’s a lot of pressure on me when I got the Canada Research chair to apply for Canadian Foundation for innovation, funding and infrastructure funding and to have a lab, and most people call it a lab.
01:34:25 Speaker 2
And you know the Bronson lab and I didn’t want to do that.
01:34:29 Speaker 2
So.
01:34:29 Speaker 1
Missy called the collective.
01:34:29 Speaker 2
So we’re.
01:34:30 Speaker 1
Oh my God, I love that so much.
01:34:32 Speaker 1
Oh my God, I love it.
01:34:32 Speaker 2
It’s a bit where well, but that’s OK.
01:34:33 Speaker 1
It it but, but that’s what science should be like to me.
01:34:36 Speaker 1
Science shouldn’t be allowed.
01:34:37 Speaker 1
It should be a collective.
01:34:38 Speaker 1
It’s a collective enterprise.
01:34:38 Speaker 2
And it is like I would be nothing without all the people around me.
01:34:39 Speaker 1
It’s about right.
01:34:41 Speaker 1
I love it.
01:34:42 Speaker 1
I love it.
01:34:42 Speaker 1
OK.
01:34:43 Speaker 1
Yeah, to me.
01:34:44 Speaker 1
Yeah, that’s not.
01:34:44 Speaker 1
Like communism, that is.
01:34:47 Speaker 1
That’s that’s good science anyway.
01:34:47 Speaker 2
It’s.
01:34:49 Speaker 1
Alright.
01:34:49 Speaker 2
And that that.
01:34:50 Speaker 2
Yeah, my approach to the Academy, but that’s OK.
01:34:51
Fair enough.
01:34:53 Speaker 1
They’re enough.
01:34:54 Speaker 1
It’s the best part of it.
01:34:55 Speaker 1
Anyway.
01:34:56 Speaker 1
Well, thank you, Kelly, and thank you to those of you listening to the doing Social Research podcast.
01:35:02 Speaker 1
If you enjoyed this episode, please take a moment to give us a rating on your favorite podcast platform and share what you liked about it.
01:35:09 Speaker 1
And if you did not enjoy this episode, no one wants to hear from.
01:35:13 Speaker 1
This this will really help us to reach more listeners and make doing Social Research within the reach of everyone.
01:35:19 Speaker 1
I’d love to connect with you.
01:35:20 Speaker 1
You can follow me on like all the social media things except what’s the one I don’t use Snapchat.
01:35:26 Speaker 1
I’m too old for that, but Instagram, Twitter and I wanna get Sky blue is like the new one.
01:35:32 Speaker 1
I hate X and I hate Elon Musk well.
01:35:35 Speaker 1
I’m like, yeah.
01:35:36 Speaker 1
So anyway, I need to find you anyway.
01:35:37 Speaker 1
I’m at socio Mama.
01:35:39 Speaker 1
I also have a Facebook group which do the doing Social Research.
01:35:43 Speaker 1
Facebook group to keep the conversation going a little last count there was like 4 people joined.
01:35:47 Speaker 1
So thank you to my three friends because one of those.
01:35:50 Speaker 1
Me.
01:35:52 Speaker 2
I will join you. Yep.
01:35:53 Speaker 1
Thank you.
01:35:54 Speaker 1
So links to all my social media to references mentioned in today’s episode will be in the show notes.
01:36:00 Speaker 1
If you have a question about Social Research you’d like me to tackle on the podcast or make a post about on the website, you can send me an e-mail at [email protected] or message me through the social media.
01:36:12 Speaker 1
LinkedIn is another one.
01:36:14 Speaker 1
And don’t forget to check out the website doingsocialresearch.com.
01:36:17 Speaker 1
It’s still like kind of a mess in a work in progress, but I am hoping to keep it tidy tidied up and I always want to add more information to it.
01:36:25 Speaker 1
Special thanks to our sound editor Willow Ruby Young for making a sound amazing.
01:36:29 Speaker 1
Jonathan Boyle is a real life person who I paid money to.
01:36:34 Speaker 1
I don’t know him at all to write her theme song like a *** ***.
01:36:36 Speaker 1
And my favorite joke is to say.
01:36:38 Speaker 1
I know you can just like go online.
01:36:40 Speaker 1
It was like 100 bucks to get the rights to his song and so cause some people use like AI to make music.
01:36:45 Speaker 1
And I was like, I’m gonna pay a musician anyway.
01:36:47
OK.
01:36:47 Speaker 1
He was awesome.
01:36:48 Speaker 1
Like again, I have no idea who he is, but I love that the song is like a ******.
01:36:52 Speaker 1
Because I think, like we’re all ******** who are out there trying to do the Social Research in the face of like, you know, Monsanto or whatever.
01:36:59 Speaker 1
Anyway, I’m philanthropy and this has been the doing Social Research podcast.
01:36:59
The.
01:37:03 Speaker 1
I always remember keep keeping it real and keep doing Social Research.
01:37:06 Speaker 1
Bye. Hi.
01:37:08 Speaker 1
And.
01:37:14
No.