Episode 86 – Spotting Tech BS

On this week’s show we look back on Malcolm Gladwell’s Outliers 

Next up in the WB-40 Bookclub is Simon Sinek’s Start With Why

We also have a fabulous interview with Society Inside’s Hilary Sutcliffe.

Don’t forget you’ll be able to hear our new experiment in Podcasting Powered Projects from 10am on Wednesday at https://wb40podcast.com/the-flexible-movement

— This transcript is generated by http://otter.ai. It is provided to help with searching our site and may be of little or no use otherwise! —

0:21
Hello and welcome to Episode 86 of WB-40, the weekly podcast with me, Matt Ballantine, and Chris Weston

0:28
We are at episode number 86, which is a fine number to have reached, and it’s been an interesting week in the world of technology and business. What have you been up to?

0:38
Well, I think we should first of all say, for the eagle-eyed members of our audience, you may be noticing that this show is a day late. And I’m reminded of a story from my first ever boss. I worked at KPMG back in the early 1990s, and my first boss was a chap called Neil. Neil was funny, and he came into the office one Monday morning with a perfectly circular red mark on his forehead, and we spent the first part of the morning wondering whether we should ask him what this perfectly circular red mark on his forehead was

1:17
and eventually somebody summoned up the nerve

1:19
Well, where it eventually got to was, we had a weekly team meeting on Monday at about 11 o’clock, and my colleague, at the end of the meeting, said, Neil, what’s that mark on your forehead? And he said, do you want the truth, or do you want me to make something up? We said we wanted the truth, please, and he said, well, at the weekend I bought a new sink plunger

1:46
Excellent. Sorry, was he pretending to be a Dalek?

1:50
You can tell he had it the wrong way round, because of the sucker thing, as he pointed out

1:54
But it’s a perfectly reasonable thing to do. Having bought a new sink plunger, it’s almost obligatory to stick it to your forehead, and if the collateral damage is a round circle on your head

2:09
when you go to work on a Monday, so be it. I applaud such commitment. Exactly. So anyway

2:15
The reason why, we could say, is because we were incredibly busy and we’ve got a new show, a new thing, launching tomorrow. That would sound good, but no, the reason why this show is a day late is because we recorded it all yesterday, and then when I went to do the edit I found that there was something wrong with the sound card settings, and the whole thing was basically unusable, unless, funnily enough, you wanted to listen to Chris sounding pretty much psychedelic. So there we go. But apart from that it’s been

2:44
it’s been a good week so far. I wouldn’t have wanted to start the new thing under a cloud like that. I’d rather we’d just ploughed on regardless, and we could have done, you didn’t have to

2:54
Now, I think honesty is generally the best policy. Not always, but generally the best policy. It has been an interesting week, actually, in the 24 hours since we last met. It looks like we’re now at last going to enter the complete shitstorm phase of the ever-decreasing circles of Brexit. As we record this, people are apparently popping in and out of Number Ten and starting to get angry about things, so that’s going to be entertaining. But in better news, it’s been confirmed that Farage is under investigation by the Russia inquiry being led by Mueller in the United States, so, you know, so when, when

3:36
you know, hope springs eternal. I saw the Prime Minister talking about endgame, and I think the chess term I would use to describe the position the government is in is zugzwang, which is one I think they’ve been in for some time. And the splendid thing about zugzwang is that it’s a term meaning you’re essentially in a perfectly reasonable position, if you look at all the pieces

4:07
and you’re not being attacked anywhere, so a casual observer would look at the board and say, yeah, that looks evenly balanced. But actually any move you make puts you in a worse position, and it’s a pretty horrible place to be. And it’s kind of a place you get yourself into, and that’s what I would call their situation at the moment. So when she’s got to make a move, she’s got to pick one of her pieces and move it, and whichever one she does, she’s going to end up with a problem. I’m interested to see which one it is, and I think tomorrow could be one of those exciting days where all sorts of people resign, and the usual suspects flounce out of government saying they’ve been terribly let down when actually it’s all their fault, ha ha

4:54
Ha ha. I will be in Westminster, not actually doing anything important, but I will be in the area, I should say. The way I judge it is by how many camera crews there are on College Green. Recently there hasn’t been a square foot of grass available on College Green. The measure is whether you’ve got three levels or four levels of cameras, or no cameras at all, or just a couple, or marquees. We’re possibly into marquee territory tomorrow. College Green, if you don’t know, is where they shoot all of the footage with the nice view of the Houses of Parliament in the background. Anyway, on with the show. We’ve got an interesting, I think, and action-packed show this week. You and I are going to have a little chat about the book club book, which was Malcolm Gladwell’s Outliers. We’ve got an interview with a fascinating lady called Hilary Sutcliffe, who runs an organization called Society Inside. And we’ve got a sneak preview of a new project that is going to be available in just about 12 hours’ time from my recording this. So let’s get on with the show.

6:04
Two weeks ago, two episodes ago, we chose Malcolm Gladwell’s Outliers as our book club pick. And it’s something that has been around for a little while. It’s not a new book; it’s by now a bit of a classic, I suppose. But it’s one I hadn’t read, and I’m quite glad that I did, because it was a really very readable book. He’s got a very good anecdotal style, but also there’s a lot of detail, you know, you can see the research. And I think, for me, it’s those descriptions of how people end up achieving what they achieved, or making mistakes sometimes that maybe you wouldn’t expect them to make. It’s very rarely down to a sort of blind

6:52
faith, or just hard work and all of those kinds of things that people sometimes ascribe to having the get-up-and-go spirit, the “if I can do it, anybody can do it”. It isn’t the case. And it’s a bit like Gary Player used to say, you know, the more I practice, the luckier I get. And, I mean, one of the things I hadn’t realized is that it was the book where the “10,000 hours makes you into an expert” kind of claim came from,

7:21
which, again, doesn’t have much science behind it. It’s kind of anecdotal, and it’s something I think has been debunked a little since, but there’s a lot of truth in it nonetheless.

7:31
Yeah. I think, first and foremost, Gladwell is a superb author, a superb storyteller. And if we look at the books that we’ve had over the last six months or so that we’ve been doing this book club thing, I think, of all of them, probably he and Dan Pink are the two that I would say are the most engaging writers.

7:52
And yeah, the core of the book is what we learn from looking at people who are exceptional performers. And what we learn is that it’s not because of natural ability alone, and it’s not just because of hard work; it’s also about circumstance,

8:08
I

8:08
guess. I only went through the first three chapters again. It’s a book I had; it’s so old I have it in hardback, which is a bit of a giveaway for a book these days, as I never get the paper version.

8:21
But I had forgotten about the 10,000-hour thing. The one thing I don’t remember from skimming it again is whether it looks at the concept of survivorship bias, which is always a danger with stories looking at why people are successful. Because actually, what you do is you draw conclusions from a limited subset, but you don’t look at the people who weren’t successful to be able to validate that. And actually, one of the reasons why anything is successful, paradoxically, is because lots of other things weren’t. The reason successful startups are successful, the way the startup ecosystem works, is because most startups fail. And so it’s the ones that rise out by exception, and trying to draw lessons from why they rise out is a fool’s game, because actually you might well find that there were loads that failed that had exactly the same situation, except for tiny little elements of luck.
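That survivorship-bias point can be made concrete with a quick simulation (a hypothetical sketch for illustration, not something from the show): give every startup a random skill level and a random luck level, keep only the “successful” top 1%, and then look at what the survivors appear to teach you.

```python
import random

random.seed(42)  # fixed seed so the illustration is repeatable

# Simulate 10,000 startups whose outcome is half skill, half luck.
startups = [{"skill": random.random(), "luck": random.random()} for _ in range(10_000)]
for s in startups:
    s["score"] = 0.5 * s["skill"] + 0.5 * s["luck"]

# "Success" = the top 1% by outcome -- the ones the books get written about.
survivors = sorted(startups, key=lambda s: s["score"], reverse=True)[:100]

def mean(xs):
    return sum(xs) / len(xs)

print(f"population mean skill: {mean([s['skill'] for s in startups]):.2f}")
print(f"survivor   mean skill: {mean([s['skill'] for s in survivors]):.2f}")
print(f"survivor   mean luck:  {mean([s['luck'] for s in survivors]):.2f}")
```

Studying only the survivors, you would “discover” that success comes from high skill, while the equally large dose of luck stays invisible, as do the thousands of similarly skilled startups that failed and never made it into the sample.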

9:23
But I think, you know, it was nice to read some of his stuff again. I’ve not read much Gladwell for a while, as I’m into his podcast, which is well worth checking out. He’s actually just launching a new podcast with him and his mate Rick Rubin, the guy who was one of the co-founders of the Def Jam record label, taking a kind of Gladwellian look at the world of music, which I’m really looking forward to, because that brings together two things I really like.

9:51
And yeah, it’s one of those books where you remember the anecdotes. I remember when I was working doing management training, I used to love Gladwell’s books, because they would furnish you with a bunch of anecdotes that you could use to illustrate things you were trying to get across to people. And the whole essence of storytelling being a really valuable thing, I think, read somebody like Gladwell and you understand why.

10:13
Yeah, that’s true. And also, the powerful messages are really about human nature. There’s one part where he talks about practical intelligence. He talks about the fact that people can be very, very clever, and you can use all manner of IQ measurements to show that they’re very, very clever, and they aren’t necessarily successful, because being right isn’t the same as being successful in that sense. You can be as clever and as fundamentally correct as you like, and you can have all sorts of theories that turn out to be correct, but if you can’t convince other people of their veracity, then you’re probably not going to get very far. And the practical intelligence part explains how people can take high intelligence, then apply it and be more successful. And there’s a part where he talks about some airplane crashes where essentially it was a bad lack of communication in the cockpit, and the difference between different

11:25
co-pilots and pilots. And the fact that these were cultural, rather than, you know... if the world was a fair place and everything was a meritocracy, and people were judged on what they did, these things wouldn’t happen. But because

11:43
I think one of the examples was Korean Air. They had a lot of issues in a certain period of time, and one of the things it was escalated to was this issue of deference in the cockpit, where it would have been unthinkable for a co-pilot to challenge the captain on a decision. And it was

12:03
Qantas, wasn’t it, the other one? I didn’t reread that bit, I’d forgotten about that. For the Qantas pilots, a strident “Strewth, Bruce, what are you doing, you’re about to fly into the side of the mountain” isn’t something that a co-pilot would have any worries about saying, whereas in other cultures the deference, because of perceived position and seniority, meant that they wouldn’t go about doing it. And that

12:24
Yes, exactly. And it’s that kind of anecdote, isn’t it? That actually it’s about being able to

12:30
It’s funny, I’ve been at an event all day today where there were some people talking about the use of data in the world of marketing. And there was a bit that particularly riled me, which was “because we’re using data, we’re taking subjectivity out of things”, and oh man, you don’t understand this stuff.

12:49
We have uncovered the wisdom, we have uncovered the truth at last.

12:53
How did you get the data? Oh, we did three surveys.

12:57
But actually, the point about the world we’re in, this comes back to it. You know, a lot of academics hate Gladwell because he’s successful and because he’s able to communicate, and there’s this idea that academic texts almost should be really hard to consume, and that something that is justified by anecdotes and story has no value because it’s not backed up by data. Well, no, data isn’t necessarily just numbers, and data isn’t any more objective than a good story. And I think we get into these... I don’t know. I think the world needs a lot more good storytellers like Malcolm Gladwell, is the bottom line for me.

13:37
Yeah, that’s true. I think the whole practical intelligence piece is either him consciously or subconsciously saying that, because he’s got the ability to make use of the information that he gathers. In the same way that Johnny Ball was to us when we were young.

13:54
There are no doubt lots of far more qualified maths teachers who wouldn’t be as good.

13:58
Yeah, Johnny Ball, an amazing man, an amazing man. Anyway, so that’s that. Outliers is well worth the read if you haven’t read it, and well worth a re-read if you’ve already read it in the past. And the book that will be taking us through the next two weeks, well, we seem to be going on a run of titles that have been around for quite some time, because the one that we’re going to be doing next is Simon Sinek’s book, Start With Why. So we’ll put a link to that on the web page, and there’ll be a link to Outliers there. And I might even get round to putting a link to Gladwell’s new music podcast. But obviously, you know, listen to that after this one’s done, not before.

14:43
So we’re going to continue our run of having some excellent interviews, because you talked to Hilary Sutcliffe of Society Inside. This is a nonprofit company, I think, and she was talking about how to spot technology

14:58
bullshit, is that right? Yeah, pretty much. So, as you’ll hear, Society Inside is a thing that Hilary set up a few years ago. It’s a sort of, I guess, loosely you’d call it a think tank, and it looks at being able to allow people who aren’t technical specialists in a particular area to have more of a say in thinking about how we should be setting direction and policy for evolving technologies, not just digital ones. So let’s hear how it went.

15:30
I run a nonprofit called Society Inside. It’s a sort of riff on “Intel Inside”, but the idea that society’s needs, people’s and the planet’s needs, are at the heart of innovation, not just some scientific curiosity or the need to make money. So this really came from looking at the impact of technology. I started in 1994 with Shell and Brent Spar, and then became very involved in the corporate responsibility area, and then it morphed into tech: it became all about nanotech, I work in biotech, and recently, I think, in quantum tech, also AI, robotics and various techs. So it’s the cross-tech learnings for how society’s needs and wants can really be put at the heart of technology development.

16:19
And is that a general principle at the moment? Well,

16:24
I mean, obviously you see this as an issue, because otherwise you wouldn’t be doing the work that you do. But is it a consistent gap across different technology fields? Is it about emerging technology more than established technology? What are the gaps that you see at the moment in terms of how people’s needs are being represented by technologists?

16:47
Well, it’s funny. Having now gone through maybe five or six techs, and I was at an Internet of Things thing the other day, what’s really becoming quite obvious to me, and actually it’s shocking to me, is that, you know, people and human nature mean that we never really learn the lessons. So right from the very beginning of early GMOs, the tech people are really excited about the tech, and really obviously they don’t want to hear negative things from people, particularly those NGOs or others whose opinions they don’t really regard in the first place. And the NGOs are very skeptical: yeah, well, we’ve heard that about tech before; yeah, well, you know, let’s just wait and see what we find. Then the businesses are really trying to, you know, make money where they can, trying to see the market where they can, and the policymakers are a bit of a rabbit in the headlights, thinking, oh my God, what have we got to do with this? You can genuinely swap the tech name, and I have had the same conversations with the same words, tech by tech, over the years. So a really interesting thing is how do we, you know, engage people at all those different nodes in the innovation chain, if you like, to think in a more progressive and a more rounded way. And that’s really what I spend my time doing: talking to different stakeholders, thinking about, you know, how do we start from a problem that needs to be solved, rather than this perennial “tech looking for a home” problem that we see time and time again.

18:26
There’s also, sometimes, maybe the fact that just because you can do something doesn’t mean you necessarily should do it.

18:35
Yes. I mean, this is very tricky, because a lot of that is asking people not to develop a product or technology for a rather obscure, you know, sociological issue that doesn’t really feel like it resonates with them. And certainly this idea of cumulative negative impact on society, that’s something that is very hard to see from the beginning, or from the inside. I read a great story this morning about a really very detrimental technology that’s shortening our lives, that’s dramatically increasing ill health, and changing the physical shape of humanity, called the chair. And suddenly you realize, you know, if you were a chair manufacturer, you’re not going to stop making chairs. Except that, for the last sort of hundred years, sitting down has caused dramatic problems. So we see this in terms of the internet. And, you know, it’s a lot to ask a chair manufacturer not to make a chair. So this is part of this very big-picture discussion, particularly about the internet, about artificial intelligence, and these digital techs: what are they doing to us? It’s a very complex picture when you’re talking to, you know, a startup making something quite interesting that they think is a great idea.

20:00
What sorts of organizations do you do this work with? Is it, sort of, broadly NGOs?

20:06
Who is it that you’re trying to collaborate with?

20:11
Well, we’re trying to bridge, really, companies, scientists, NGOs and policy people. Because when we look at it, if you’ve got a chain of innovation, the policy people decide where the initial big science funding goes. So they may have challenges, they may have technologies; we’ve got in the UK the eight great technologies. And there’s a great quote from Robin Crosby, which is: tech’s not bad, it’s not good, but it’s not neutral either. So you are making a political choice in choosing one tech, one solution, over another, and by choosing a technology solution perhaps over a systemic solution, or a human solution. So at that very moment you’re starting to make choices, and we engage at that level. Then we talk to people, particularly in the research councils, because a lot of the work we’re doing is quite cutting-edge tech, which starts in the lab. It starts in science in a university, but there’s a whole layer of sort of political decisions that are made at that point: how to fund, who to fund, what questions those people are going to have to answer before they get the money. And that’s another whole area. And then there’s business R&D. So how are they using their R&D, not just for a very narrow money focus, but actually to try and think about the impact, the responsibility, of the R&D? And, you know, having done 15 to 20 years in CSR, the R&D department is a black box in CSR, and it’s only just starting to open up in all sorts of different areas. So those are the areas, and then, of course, civil society

22:00
can make or break your technology. Individual people can make or break a whole area. And there is a sort of democratic issue there of: should we let three NGOs rule the world? But then again, they’re asking very sensible questions that we as people perhaps want to be asked. So that’s another area. All of these little areas need to be bridged and need to discuss, as soon as possible. And that’s what this idea of anticipating technology, anticipatory governance, is all about: trying to think about it in advance, before it all goes a bit wrong. One question: do you think there’s a gap

22:36
at senior levels, both, if you think about it, on the policy and political side, and also in business as well? Do the people who are running these organizations actually have the experience and the knowledge to be able to make good value judgments on these kinds of issues?

22:56
I think it’s difficult. I think you see some brilliant people who do, and I think you see some people who haven’t got a clue. They’ve got obviously very different pressures. One piece of work that we’ve been doing for a number of years is to have the investment community ask CEOs these more thoughtful, more nuanced questions about the externalities of their product. As soon as an investor with money asks that question, the whole conversation changes. But they don’t ask. So it is quite difficult to have organizations and CEOs think about that when they haven’t got that direct incentive and pressure to do it. But we see that pressure growing now. We see it growing from investors, like Larry Fink asking that question of technology companies. We see it from politicians, we see it from companies. I mean, a lot of the discussion recently about Google not going forward in certain areas is because their employees won’t do it. I was at a biotech meeting in New York the other day, and their employees say, no, we don’t want to do that, we want to do this. So I think companies are getting pressure from all sorts of different angles: do we want to do this thing? How do we want to deliver this product area? And how do we want to communicate with society, in a way that I don’t think they’ve ever had to before. And I think a lot of organizations are finding that really tricky.

24:17
The sorts of technologies that you’re talking about... I understand quite a lot around the information technology, digital, data kind of world, but the work you’re doing with Society Inside is much broader than that. What are the sorts of emerging techs that you’re looking at that maybe wouldn’t be familiar to me, or

24:39
something? Well, I think nanotech is an interesting case study, in a way. That came around in the UK in about 2004, in the US a little bit later, Europe at about the same time. And with nanotechnology, they wanted to genuinely learn from GMOs. If they’d done a decent job with GMOs, I wouldn’t have a job at all, and neither would the academics in the area. And they were thinking, well, what do we need to do to actually have a technology that society can really get behind? We developed a multi-stakeholder initiative called the Responsible Nano Code, which was trying to look particularly at companies, but also scientists: what do we have to do? And the lesson from that is, you know, it’s got to be a shared idea of benefit. Society has to agree that this is a benefit that we all quite want, and we see how AI sometimes fails to do that, how biotech sometimes fails to do that. And then you have to look after the externalities honestly. So you have to look after the negative impacts, and that in itself is tricky, particularly when you’re really excited about your tech. You know, the quantum tech crowd just recently, they just can’t see that there are going to be any: no, no, don’t worry about the governance, we’re absolutely fine. And we see this in digital, you know, even with Facebook and Twitter saying, we need some governance, please give us some, after ten years. And so we see these lessons again. Similarly, biotech: you know, we’ve got GMOs, which are all about the insertion of foreign genes.
Now we’ve got gene editing, which is about editing, cutting and pasting the genome. You would hope, given it’s virtually the same technology, they would be learning from GMOs, but no: different people, no, no, no, it’s going to save the world. So this idea of hype is a big, big problem. I see it time and time again. You know, nuclear was going to give us energy too cheap to meter; biotech was going to solve the world’s problems; and the great one for nanotech was the Cancer Society of America saying that nanotechnology will eliminate cancer by 2015. So that’s going well. This is the thing I would say to anybody involved in tech development: just tone it down, people. It’s not going to do you any good to overcook it. And AI is one of these ones. I mean, when AI is everything from data analytics to the singularity, you know, you’d hope we’re reaching peak bullshit, but we’re probably not. But when we were planning this, we were talking a bit about how,

27:24
on a practical level, people can go about trying to identify when

27:30
Well, it’s tech bullshit, basically. Particularly where you’ve got intractable problems that will be magically solved by technology. Yeah,

27:40
so for me, it’s very much about, to start with, having those people who are trying to sell the technology explain it better. Because actually they’re so in thrall with the tech angle, whether it be the biotech angle, the AI angle, the blockchain angle, that’s all they can talk about. So that is a barrier to, you know, your guy being able to feel comfortable in making that decision,

28:12
but also it actually is a semi-deliberate ploy to make the other person not feel quite clever enough, and therefore say yes. So what’s been great about my world is, I did history of art; I am not clever enough for any of this. So they have to explain themselves to me, and I’m perfectly happy saying, no, sorry guys, you’re going to have to do it again, I didn’t even get that bit. And it really makes them do it. So the first thing I would say is, do not be worried at all about saying no, don’t get it; sorry, don’t get it; sorry, still don’t get it, and making them explain themselves better. And also, if you are running a technology company, train your people, help your people. Because a lot of the time... somebody, the UK guru of the public understanding of science and risk, said, look, some of my guys can’t even look you in the eye, never mind tell you what the benefit is. So you know, this is something that needs to be prioritized within startups and within companies: to try and communicate effectively, and demonstrate effectively, why this is a solution to a real-world problem. So that is my first one. I think

29:25
there’s also something about, with an emerging technology, trying to apply it to the right sorts of scenarios at the outset. So if I think about the work that I’ve done in the legal services sector, and the conversations that have been around legal services and AI in particular: for me, an emerging and unproven technology should be looked at primarily as a way to develop new ways of doing business, not mainly as a way to streamline existing models of business. So the way that a lot of law firms have been approaching AI has been to say, oh look, here’s a way we can reduce down our paralegal costs. And so they then go into it with a business model, a business case, that’s incredibly waterfall, because it starts with: we will invest this amount of money, and therefore we will be able to save this number of paralegal heads, and so therefore we will make a return on this by X, because of cost saving. And actually, any technology that from the outset, at the early emergent stage, is being purely played as a cost-saving model feels to me to be probably deeply flawed. First, because cost saving as a primary objective for projects is usually extremely difficult, because it’s not a great thing for people to motivate themselves around. But secondly, because if you save costs on an existing business model, that saving will almost invariably be expected to be passed on to your customers at some point. So if you’re investing to reduce down your headline revenue, you’re making a very bad mistake early on.

31:10
Yes, I totally agree with that. Very short term. And then imagine if the time and effort that you put into that could have been used looking at something that would perhaps leapfrog, and solve a problem that, you know, really matters. And maybe

31:27
another element of this is actually people needing to better understand what it is that they actually do, in order to apply a new technology. The archetypal example of this is how the railroads in the US, with the emergence of

31:44
air travel, were completely blindsided, because they saw themselves as the providers of trains, not the providers of transportation. And so they weren’t competing with these aeroplane people, because airplanes were airplanes and we do trains. And sure enough, they completely lost the market as a result. When applying a new technology into a scenario, you also need to be really clear about what it is you’re doing today, to be able to apply it in a way that makes sense and doesn’t just lock you into your old established business models.

32:15
Absolutely, and I think that’s really interesting, because you see that time and again. I find some quite interesting examples of that in the developing world, where they’re not really hidebound by this huge infrastructure; they just want the best thing for the moment. So, you know, the use of mobile phones in African farming to get satellite weather patterns and so on. I mean, they don’t care about the infrastructure, they just want what they want. And that’s where I see the developing world particularly will perhaps take advantage of technology in a way that, somehow, in the West we’re a bit too stuck in our own ways to do.

32:53
In terms of being able to make assessments as individuals, rather than as organizations, we talked a bit about the idea of, I guess, humility, and being willing to ask questions.

33:08
What else can people

33:09
be doing, though to,

33:12
I guess, be less afraid of what they don’t know,

33:14
A lot of the problem we have as individuals is, you know, an expectation that we need to know everything. I had a technologist tell me that the public has a duty to understand: “I’m a technologist”. Like, no, no, no. And so I think, in society now, where there is so much going on, the more relaxed we feel about our inability to actually do it all, I think the better we’ll feel, and the more likely we’ll feel able to ask questions. I do a little mini speech of five things we can do to avoid the tech apocalypse, and a lot of that is: do what you can do, and don’t worry so much about the stuff that you can’t do, and do what you can do in your own world. So you don’t have to understand nanotechnology, you don’t have to understand biotech, you don’t actually have to understand AI, and don’t worry that you don’t. But in the world that you’re in, the world that is your responsibility, whether it’s, you know, what Alexa is doing to your three-year-old child, or what you’re doing if you’re an HR person in an office, and how you’re perhaps not as aware as you could be about how machine learning can be discriminating. Just look at the things you can do, but don’t feel overwhelmed by the things you can’t do. And I think that is the most disempowering thing about the tech discussion at the moment: it is genuinely overwhelming, and you just feel disempowered.

34:50
There's also then making sure that the framing of these things is right, and that the problems are the problems that are actually there. I was in discussions with some people this morning, actually, about this: there's a very common narrative at the moment about automation and artificial intelligence, that there will be less work as a result. Now, I don't think that's the case. I don't think any wave of technology has ever got rid of work. It's got rid of certain types of work, and then you get replacement with other types of work, and often you end up with more work to do. It might be less physically demanding; maybe in the future it will be less mentally demanding in the ways that we're not particularly good at, but strengthening work that is more focused on the things that we as humans are good at. But if we frame the whole thing as "AI and automation will lead to less work", we won't put in place what we need, which is: how do we allow people to get much better at lifelong learning, at being able to adapt, at being able to spot what new opportunities there are? How do we train people for the new skills they need for the new sorts of work coming up? Because otherwise we'll be sitting there waiting for the leisure society to emerge, as we have been since the 1950s and 60s,

36:01
which, if I remember rightly, is when that idea was first put about.

36:05
Yes, absolutely, I agree with this. But on the other hand, I still have a slight resistance to this mantra, and I'm not calling you dogmatic, Matt, that this idea that new technology has never resulted in lost work before, so why should it now. You only have to look at Brexit and Trump, and the wastelands in the north of England and in America, to see that yes, it really does leave people behind, it really does result in loss of work. But as you say,

36:39
what it doesn't do, what the current system doesn't seem to do, is see that, acknowledge that, and do something about it in advance.

36:47
Well, what was the technology that left, you know, swathes of the United Kingdom without work? Because if that were the case, Germany wouldn't be Germany.

36:57
Well, the technology was not needed anymore. So, you know, mining was not needed anymore; it was superseded by another technology, whatever that might be. So those technologies in those areas were overtaken by another way of doing that thing, which resulted in people being out of work. But because they weren't trained for anything else, and nobody wanted to come to those places, they were left behind. And that is the root of one of the big problems of isolation and polarization and of populism that we're facing, and that has traumatized, you know, the whole of the world. So I still think that is about technology leaving people behind when it's moved on, and not responding to that in advance.

37:45
So, so that we don't finish on a downer, because I don't think either of us are anti-technology in any way: what do you think are examples of industries or technology fields that are doing this better?

38:02
I actually think that, generally, no one is doing it better. But in every single technology there are people who are doing a fantastic job, and they really are thinking about these things. You know, AI has got a load. I was reading some interesting stuff this morning about the whole cultured meat sector and what they're trying to do, learning from the past; there's quite a lot of interesting work with biotech people trying to do that; nanotech trying to learn. So every sector has got the people who are doing a lot, the people who are just waiting to see what happens, and then the people who are never going to do anything. And the problem for technologies, if you're clumping them into some sort of "-ology" base, is how much those people who are committed to doing it badly will infect and ruin your technology in the first place, and how much the good people can prevent by doing things well. So the self-driving car area, for example, has got all sorts of organizations doing it: some really trying to lead, some really trying to get in under the wire and hoping they can grab the market before everybody has decided what the governance issues might be. And I really couldn't tell you how that's going to play out, but it happens in every technology.

39:21
And I guess that last point there is actually one of the things that has also been consistent, isn't it? No matter how much we would like to find that amazing oracle who is able to predict the future for us, and it's a human trait that goes back forever, the reality is: nobody can tell you what the future is going to hold. No system can tell you what the future is going to hold; no artificial intelligence will be able to tell you what the future will hold. They can probably give you extrapolations of what it might hold, and maybe with more and more data we'll be able to get better at those extrapolations, but it's still not guaranteed.

40:00
No. I mean, I use Airbnb as an example: some guys renting out blow-up beds, and seven years later they're a threat to global tourism, being castigated and banned. That was not a knowable thesis,

40:15
but there's also that other question that people throw back at me all the time, which is, you know, Henry Ford saying, "If I had listened to my customers, I would have given them faster horses." That is also true. So it's a complicated world, and that's where I really go back to this idea of, you know, personal self-empowerment: not being intimidated, not being afraid to ask questions, but also, back to your MMR issue, finding trusted sources of information. "Lay out your trust with care" is where I'm at. Because, you know, the anti-MMR people, they're very credible: very credible people with very credible stats. If your son has just got autism and has just had the MMR, you're going to go for that, aren't you? So they're not unreasonable things these people are saying. But I think one of the things that perhaps your listeners and yourself might be trying to do is to be more credible, open, honest, authentic sources of information that we can rely on.

41:22
So thank you very much to Hilary for that. It sounded like a fascinating conversation, and

41:28
certainly food for thought for those of us who are in the technology game, when you hear somebody talk about the reality of listening to someone trying to pitch a technology, service or solution.

41:43
Often, as technologists, people are so passionate about what they do, and they live in that world where they absolutely understand it, they know it inside out and top to bottom, and it can be very difficult to understand that other people don't. And it's that part where she talks about how she has to kind of stop people and make them explain it in simple terms, without, you know, without being condescending. Because, let's face it, if you don't know about nuclear physics, or you don't know about medicine, or you don't know about, I don't know, horse riding, you can't be expected to understand the minutiae of it. And

42:27
that's the kind of understanding and awareness that the tech industry and people working in tech need, isn't it?

42:34
Yeah, I'd agree. I think, reflecting on it in the couple of weeks since Hilary and I caught up, the thing I think was most useful for me after that conversation was when talking with her about technologies like

42:49
GMO or nanotech or biotech, which are things I know nothing about, and how, when I look at those things, I need to think about how I feel about those sorts of technologies when I'm talking with people who don't come from the world of digital and information technology, because that's how they feel about the stuff that I'm an expert in. And that's actually quite humbling, and I think that's important. I think that's part and parcel, actually.

43:23
It's double-headed, isn't it: there's a deference towards expertise, and there's also the kind of Michael Gove-ian hatred of expertise; we've got these weird parallels going on at the moment. But for people who are in positions of power and authority it's often very hard to be able to say "I don't know" and "I don't understand". And that's a real failing, and it's a really dangerous failing, because that's how you get groupthink, and it's how you get people led into places where they make silly, silly decisions. And I think the interesting thing for me about Hilary is that she's somebody who comes from outside most of the fields she's involved with. She's incredibly knowledgeable, she has a really good understanding of these things, and she can express it in a way that an expert wouldn't be able to. But because she comes from the world of communications and PR, she still uses that as a way to be able to say: I'm not from your world, therefore I'm going to ask this question, because I have no problem about it; I have the humility to be able to ask you, and I have curiosity enough to want to find out more about how it works. I guess to a little extent that's why I still play the sociology card quite often, saying that, you know, I've worked in the tech industry for 20-odd years, but at core I'm still a social scientist. And actually, it's the first time I've really made that connection, but maybe that's part of what I'm doing there: by identifying myself as an outsider, it puts me in a position to be able to ask what otherwise would be questions that people would be afraid to ask.

44:53
Yeah, I was reminded of something I read from Michael Crichton a few years ago. He talked about

45:01
the kind of amnesia effect that people have when they read newspapers. So they read an article; say you're a physicist, and you read an article about physics, and you realize the journalist has actually got a poor understanding of the whole thing.

45:19
It's like the "wet streets cause rain" type of story, where you realize that it's all just broad brush, and then you get exasperated and you curse the journalists for their sloppy understanding

45:34
of your field of expertise. And then you turn with interest to the international affairs section and read about Palestine or Russia or whatever, and you immediately forget that the newspaper you were just reading on the previous page didn't know what it was talking about.

45:53
That only really happens with that sort of thing. If you worked with a company who completely lied to you, or fobbed you off with some story about something that turned out to be false, you probably wouldn't work with them again; you'd lose faith in them. But we don't do that with the media. And I think

46:15
part of the issue, especially since the world is so much more complex now, is that the level to which research is happening in these different subjects is so

46:29
in-depth that you can't possibly hope to understand it. Even something like chip design, say, to take something we know something about, even to a relatively

46:41
high level. If you look at the microchip when we were young, it was a whole bunch of transistors on a chip, integrated circuits. Whereas now we're getting to the point where we have quantum tunneling effects, because we're using layers so thin that the electrons are hopping between different components. And the physics and the mechanics and even the design, because I think we've talked about computers designing their own chips now, are so complex compared to what they were a few years ago that you have to have many, many experts in different fields in order to do it, and you can't possibly hope to understand it all. It wasn't so long ago that it wasn't that complex, and that's the case in so many walks of life.

47:26
Yeah. I mean, to take it back, I guess in our generation of computing it was possible for one person to create an operating system end to end. And if I take it back a generation before that, my grandfather designed entire analog, rather than digital, but entire analog computers; he designed thermionic valves for GEC. So he was designing the predecessors to transistors pretty much on his own.

47:59
Whereas now there's a massive amount of expertise, as you say, an incredibly complex web of expertise, required to produce something which even then, in its own right, like a central processing unit, is still an inert, useless thing without a whole bunch of people around it. So we've actually got technologies now which are impossible for any individual to understand well enough to build end to end, because of the complexity, the scale, and all the rest.

48:31
But then there's this kind of lock-out, you know; you've got people on the margins of those different areas, and certainly from the research around collaboration, those sorts of professional boundaries are the most tricky part of being able to get teams to collaborate, going across professional divides. Even if it is, you know, the different elements of, and I don't even know what professions are required to design and deliver a modern processor, but you can imagine,

48:59
then you get the people who are a bit more on the periphery, and then the people who are completely outside of it. What it comes back down to, at a sort of basic level again, isn't it, is that we need those people who can act as communicators and adapters, not just translators but adapters, to allow communication to happen across all of those different professional domains, so that we've got a rough idea of what we're doing, where we're heading and what the impact of that might be, and also to feed back into it. And maybe that's the bit that's hardest at the moment, certainly in the digital and information technology space: being able to feed back in and say, no, just because you could do this doesn't mean you should. And that's the bit, I think, where we need to think about how we as a society exert control over technologists who otherwise will just carry on inventing anyway. And that's a real dynamic.

49:51
Yeah, that's very true, that's very true. I think on our first ever podcast we talked about IoT, and we talked about the fact that

50:01
just because you can connect something to the internet doesn't mean you should, and that's probably a theme that it's perfectly reasonable to carry through.

50:08
Absolutely. Anyway, thanks again to Hilary for joining us. It was a fascinating conversation, of which I actually cut out many, many minutes, because we talked for way longer than you heard there. Hopefully at some point in the future we'll have her on again. We'll put a link to Society Inside and to her Twitter account on the web page, wb40podcast.com.

50:31
So there we go, another show. And that brings us to a look at the week ahead. But rather than our usual prattling on about whatever it is that we're going to be doing to earn an honest living, or to avoid earning an honest living, this week we've got something that is a massive act of self-promotion, but something that I think is very exciting.

50:53
You see, you and I have joined forces with somebody else, Pauline, as she'll point out. And as of Wednesday the, what's the date? I've lost track. Is it the 13th? So, Wednesday the 14th, 10am

51:07
Greenwich Mean Time, UTC if you're that way inclined, the first episode of a new podcast-powered project is going to be released into the world. If you're a WB-40 subscriber, you'll get it delivered automatically to whatever your favorite podcast platform is.

51:27
We're looking at a theme that we've touched on quite a few times over the last few months with Pauline, which is about the nature of how work is, or isn't, flexible.

51:39
Yeah, absolutely. We did a whole episode on remote working not very long ago, and it's definitely been a theme of the podcast: how technology affects how we work, and also, let's be honest, the ability to do things and work with people in ways that you would never have been able to before, because they weren't in the same town as you or within commutable distance. So yeah, there's a lot to talk about.

52:10
So the first show will go out, as I say, on Wednesday the 14th, and then there'll be a moment of quiet, and when I say moment, I mean probably a few weeks, as Pauline goes out and meets people and talks to them about the theme. If you listen to the first show, you'll understand a bit more about what's driving her and what's driving the idea behind the project. And then it will have its own channel and stream and iTunes subscription and all the rest of that good stuff when we get into the meat and bones of it. You need to think of the show as being basically a bit like those early episodes of Strictly Come Dancing, before everybody's been trained how to dance. So keep your ears open for that; we'd love to hear your feedback on it. It gives us a little diversion from what we usually talk about and, more important, it gives you something else to listen to, which I'm sure will be great.

53:15
Thank you for joining us this week. We can be found, as always, on Twitter at @wb40podcast, on the web at wb40podcast.com

53:24
and on iTunes and Stitcher and all good podcast places.
