How can we prepare our children for AI?
The third of three debates on Artificial Intelligence wades into the perilous business of predicting technology, asking what skills schools should be teaching to equip students for technological upheaval
When Victoria University of Wellington’s Dr Stephen Blumenfeld was finishing high school 45 years ago, his career counsellor told him and his fellow students that the job of the future, the job guaranteed to be around their entire working lives, was key punch operator.
What’s that, you may be asking. And if you aren’t, when was the last time you heard the job mentioned? Exactly.
Predicting technology, and its effect on work, is a perilous business, said Blumenfeld in the last of the University’s three AI Debates, the theme of which was employment.
As Director of the Centre for Labour, Employment and Work in the School of Management, Blumenfeld was asked by a teacher in the audience what skills schools should be teaching to equip students for the upheavals ahead from artificial intelligence (AI) and other technology.
Schools – and tertiary institutions – should focus on teaching the ability to learn and accept new ideas, as well as to think critically, said Blumenfeld.
“If you were to teach skills today that were to apply to jobs that exist today, especially if you are teaching students in the first year of high school, by the time they get out of high school, or certainly by the time they get out of university if that’s where they go, those skills will be completely irrelevant.”
Technological change is affecting not only the kind of jobs we do but also the way in which we do jobs, he said.
It has given rise to the ‘gig economy’ – a new category of jobs that fall between employee and independent contractor status and are being termed ‘dependent contractor’ roles.
Like independent contractors, dependent contractors are not covered by most employment laws – they receive no paid holidays, for example. But although they are deemed self-employed and liable to pay their own taxes, they depend on a single organisation for their work.
Some countries have already started addressing the issue by defining dependent contractors as essentially employees and allowing them to be covered by employment laws, said Blumenfeld.
But New Zealand is yet to do so, he said, and continues fitting the ‘square peg’ of gig economy jobs into the ‘round hole’ of legislation created for the labour market many years ago.
Regulation – around employment but also other aspects of change and existing economic issues – will be crucial, said Dr Olivia Erdelyi from the University of Canterbury, whose research focuses on how society should deal with AI.
The potential benefits of AI are “really only possible with massive redistributive regulatory measures on the part of the state. Unless this happens, it can actually make us worse off. It can aggravate inequality, it can hollow out the middle class, it can slow down economic growth and it can ultimately decrease our welfare”.
Erdelyi also emphasised the need to be proactive about retraining workers whose jobs are threatened by new technology.
“The danger I see is we act too late, because we have a poor track record in being proactive. Just look at the financial sector – with virtually everything. The problem is once we get to the point where people are actually starting to lose their jobs, that can take the economy into a downward spiral where it’s very hard to stop it. We have to act before that happens. Because if we are too late then we can end up with another Great Depression, which we don’t want to.”
Companies can do a lot too, and some are already, said Erdelyi.
Dr Jo Cribb, former CEO of the Ministry for Women and co-author of Don’t Worry About the Robots: How to Survive and Thrive in the New World of Work, spoke about Ports of Auckland as one of the New Zealand employers “being really responsible”.
Some of its staff “will be automated out of their roles in the next year or so but their leadership, knowing this was going to happen, proactively offered support for workers so they can prepare for new roles by the time they have lost their jobs”.
However, Cribb worried about the consequences of a lack of coordination if different companies in the same sectors were to shed similarly skilled workers simultaneously.
“I just think we’re at a very unsophisticated level of debate. In that it could happen next year [with some jobs]. We could end up with seven or eight thousand people with the same skillset dropped on a labour market and that’s going to be really hard to absorb.”
Cribb and other speakers explored the inroads AI is making into the job recruitment process, including machine learning programs to screen CVs and even a robot head that conducts interviews and then sends the transcript to the hiring manager.
DeepSense “is the one that made me feel a bit uneasy”, said Cribb. “It’s a program that analyses a candidate’s social media or anything you’ve published online and it not only looks at the content but it analyses the words you use and builds a personality profile and tries to predict how you’re likely to behave in the workplace.”
Ben Ritchie, a Senior Advisor at analytics firm Nicholson Consulting, said people often ask how we avoid AI resulting in a dystopia.
“I would argue that within the job market, and particularly in some industries, dystopia already exists. It exists through unequal employment rates for Māori and Pasifika, it certainly exists through the gender and ethnic pay gaps that exist too. Obviously, this is a bad situation.
“What it means is those automated decision-making tools are more likely to be used in situations where employees have relatively less power. They are more suited to that process. Because going through a process where you are recruiting a data scientist into a small team, where you’ve got a small pool of people with requisite skills, it’s more a matter of ‘Do I like this person? Will they fit in with our culture?’ It’s more of a human decision-making process.”
Meanwhile, he said, “automated decision-making will continue to expand and be more important but it will continue to disproportionately affect those people that are most disadvantaged in the job market already”.
Associate Professor Jane Bryson from Victoria University of Wellington’s Business School, whose research includes recruitment and selection, stressed the need to avoid replicating or creating biases through use of AI.
She also raised the issue of where power lies in the recruitment process.
“I see AI as a useful tool if it is managed well and we get transparency about what lies behind the algorithms. But my concern is that currently AI puts a lot of power in the hands of the recruiter and the employer and very little in the hands of the applicant.
“That power takes two forms. One, it’s about the increasing amount of information collected on you as an individual and the judgement weighed on it. And the applicant has no real input or influence over that information once it has been scanned. The second element of that power is a real distancing from the human or social relationship you enter into in an employment relationship. What that has the effect of doing, if it is relied on too much, is reducing applicants to a cog in a machine. As soon as you distance yourself at a human level, it’s much easier to make hard decisions that don’t take account of the person at the other end of it.”
Bryson proposed rebalancing the power relationship.
“I would like to see AI that makes freely available an assessment of the likely behaviour of your prospective employer organisation. I would like to see AI that looks at organisations and assesses whether they live up to their stated values. I would like to see AI that looks at how those organisations are rated on social media. How are they rated by external agencies? What are their health and safety statistics? What are their personal grievance statistics? What are their customer ratings?
“Why don’t we have AI tools that put some more power back into the hands of the applicants instead of just the recruiters and employers?”
Read the original article on Newsroom.