The government has announced it is going to make trade apprenticeships free for the next two years for people displaced from their jobs due to Covid-19. From 1 July, fully funded vocational courses will be available in construction, agriculture, manufacturing, community health, counselling and care work. While these will fill predicted sector gaps and provide needed retraining for the newly unemployed, there has been no indication yet of any funding for the equally urgent need to retool New Zealand workers for a digital future.
Many of the workers who will lose their jobs as the economic impact of the pandemic unfolds will have a stronger affinity with technical upskilling than with apprenticeships and trades, and with support could move into future-proofed and higher-paid work.
Global digital skills shortage
Well before the devastating effects of Covid-19, it was predicted that emerging digital technologies would displace workers around the world, against the backdrop of a global shortage of digital skills. According to the World Economic Forum’s Future of Jobs report, 75 million jobs worldwide were already expected to be displaced by 2022, while 133 million new jobs would also be created. But to realise those new jobs, around 54 percent of employees would need to reskill or upskill.
The Digital Skills Forum notes that while there is strong demand for digital skills across all industries, New Zealand has tended to over-rely on imported rather than locally-grown digital workers. Putting aside whether this was a good thing for New Zealand workers, the strict border controls likely to remain in place for some time make it unlikely that skilled migrants will enter the country in the numbers required.
Workers, businesses and the country benefit from digital upskilling
The pandemic provides an ideal opportunity to encourage technology adoption and to retrain Kiwi workers displaced as a result. Retraining or upskilling displaced workers in digital skills not only benefits them personally, but also improves the digital resilience and productivity of our businesses and of the country as a whole. The Productivity Commission’s report Employment, Labour Markets and Income points out that “high-income economies are characterised by highly skilled workers with high capital-intensity jobs and the rapid uptake of emerging technologies.” New Zealand does not currently meet these requirements and, as a result, has suffered from decades of low productivity. One of the recognised ways to change this is through widescale digital training.
A study by the World Economic Forum released last year concluded that, aside from the broader societal good of reskilling workers in digital technologies, there were significant, quantifiable cost-benefit returns for governments in the form of increased taxes and lower welfare payments.
The emergence of micro-credentials
Not all adult learners are in a position to take years away from work to retrain as an ICT professional. And as technology becomes more widespread, digital skills will be important for all workers, not just ICT workers. The solution adopted by many countries is to offer credentialed, online short courses to quickly upskill and retrain workers for the modern workplace.
One of the roadblocks in New Zealand has been the slow adoption of micro-credentials. Although they were introduced in 2018, by the end of last year NZQA had approved just 73 micro-credentials (according to the Productivity Commission), covering topics such as welding, project management, and exceeding customer expectations. The process for approving and funding new courses is also cumbersome and may not keep pace with the dynamic and changing nature of workplace requirements. In addition, unlike many overseas countries, New Zealand doesn’t allow micro-credentials to be “stacked” into a larger qualification, which further limits the tangible benefits of training for workers.
The global response
Many governments around the world have seized on the opportunity to close the digital gap by offering funded and often free schemes to train and retrain their workers for a digital workplace. The job displacement caused by the pandemic has only made these efforts more urgent. For instance:
The UK government, pointing to research indicating that nine out of every ten jobs will depend on digital skills over the next 20 years, announced last year that from 2020 all adults would have access to free digital lessons.
Ireland’s Springboard+ offers free government-funded places on 288 courses leading to awards at certificate, degree and postgraduate level. The programme is open to both employed and unemployed people and is designed to address the country’s digital skills gap.
In Scotland, the Scottish Government’s Digital Start Fund helps those on low incomes retrain for a digital career by providing grants to assist with training.
In Singapore, the SGUnited Jobs and Skills Package will support close to 100,000 jobseekers, expanding job, traineeship and skills training opportunities for Singaporeans affected by the economic impact of Covid-19. This builds upon an existing scheme to address the digital skills shortage by offering Singaporeans a free credit of S$500 to encourage lifelong learning.
From May 2020, Australia’s Courseseeker programme offers workers displaced due to Covid-19 short (six-month) online courses in nursing, teaching, health, IT and science. These “fields of national priority” courses are being provided at heavily discounted rates through existing tertiary and vocational providers.
Given all the initiatives rolling out around the world to reskill workers while addressing the digital skills gap, it would be a lost opportunity if New Zealand didn’t address this issue with urgency. A digitally enabled workforce would not only improve our international competitiveness but also future-proof our economy and our workers for the next big technology disruption.
*Alison Brook is from the Knowledge Exchange Hub at the Massey University campus at Albany, Auckland. She is on the GDPLive team. This article is a post from the GDPLive blog, and is here with permission. The New Zealand GDPLive resource can also be accessed here.
Comments
Micro credentials you say? https://edx.org/
There's a big difference between completely online learning and face to face. Online only (and I'm including video link here too) is complementary - it suits those learners who prefer procedural learning (do as I do), but is notoriously lacking for fostering critical thinking and applied skills (what to do when it's not in the manual). The great thing about microcredentials is that they meet the needs of individual students as they build on what they already know - they're not forced into a three-year degree programme to acquire what they want.
As I've said elsewhere, the term AI is not well understood by the general public. We're not talking about mimicking human intelligence but about machine learning, the means by which software is programmed to recognise instances and/or event patterns (by video, audio, resistance, or other means) and 'learn' to recognise similar instances or events. I don't mean this as an ad (I'm not associated with this company), but a good example can be found here: https://www.awaregroup.com/
So AI is not 'intelligence' in the commonly understood sense of the term, but neither is it smoke and mirrors. It's just about being able to scan extremely large data sets and identify patterns that either weren't apparent before or took many hours of human effort to find.
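To make that concrete, here's a minimal sketch in plain Python of what 'learning to recognise similar instances' boils down to - entirely made-up feature values and labels, not any particular company's method: store labelled examples, then label new inputs by similarity to what has already been seen.

```python
# Toy nearest-neighbour "pattern recogniser" on hypothetical data.
# Each known instance is a feature vector plus a label; a new
# instance gets the label of the most similar known instance.
import math

training = [
    ((0.9, 0.8), "event"),   # made-up feature values
    ((0.8, 0.9), "event"),
    ((0.1, 0.2), "noise"),
    ((0.2, 0.1), "noise"),
]

def classify(features):
    # Pick the label of the closest known example (Euclidean distance).
    _, label = min(training, key=lambda t: math.dist(t[0], features))
    return label

print(classify((0.85, 0.75)))  # -> event
print(classify((0.15, 0.25)))  # -> noise
```

Real systems use far bigger data sets and far fancier models, but the principle - generalising from seen instances to similar unseen ones - is the same.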
Having worked directly in AI for an AI company, I can tell you that it is still mostly "smoke and mirrors". At the moment AI is good for pixel processing; for example, scanning thousands of images to spot potential skin cancers far quicker than any human. Yes, it's good for things like that. But anything that requires more complex interaction with humans is just fluff. We're still many, many years away, and at the current level AI is mainly expensive tech gimmicks.
Also, if the government wants to support growing tech industries that produce large amounts of revenue and can set up very quickly, you only need to look at high-growth computer graphics (CG) companies - a prime example is Weta. We need more studios like these, spread around NZ, along with computer games companies, which can also produce large amounts of revenue.
The UK has benefited from investing in these tech growth areas for a very long time - why can't we? VentureBeat article: "With record-breaking revenue, the U.K. game industry is blowing up." https://venturebeat.com/2019/03/18/with-record-breaking-revenue-the-u-k…
The bitter truth is we haven't even got adequate infrastructure to express-transit a shipping container of PPE across the country, let alone provide the newly jobless with the opportunity to upskill to ICT. That's why a significant number of jobless will need to move into construction work. I just hope there's enough investment in equipment - mixing concrete by hand sux!
On the contrary, one thing that NZ has done very well - and kudos to the previous government for getting this right and not leaving it to the market - is putting in fibre internet as a public utility. This is a core piece of infrastructure that sets us up well for creating digital products and services to sell anywhere in the world.
You have to be kidding. The UFF rollout might be better than what some countries have, but that's a pretty low benchmark. The UFF was built on the cheap and certainly wasn't well executed if, like me, you have ever tried to operate a data-rich business in the provinces or even on a lifestyle block... fast internet is a myth; it's simply not available.
Many years ago I started to learn the guitar, for which I had to digitally train my fingers to do scales. I found that my fingers were too short and stumpy to master the 4-fret stretch required to play the scales efficiently... there was nothing I could do to achieve the required level of digital skill. So, I suppose digitally I'm a failure and thus not suited to the modern world.
Seriously, where is this new acquisition of digital mastery taking us? Would someone please describe what this digital utopia will look like?
Billy Idol's guitarist Steve Stevens (who also played the Top Gun theme) also has short stubby fingers. You may have given up too easily.
A lot of people suddenly upskilled in technology during the lockdown, simply through necessity. Application and practice is much more important than instant aptitude.
Don't mean to take your comment too literally, but I think you're conflating digital with 'finger-like'. The difference between 'digital' and 'analogue' has been defined in lots of places, but I think Manovich sums it up well when he says analogue is continuous data and digital is discrete, sampled data. He then goes on to say that such data supports operations that are not possible in analogue: it can be replicated without information loss, is therefore modular in nature, can therefore be combined with other modules, is therefore subject to algorithmic operations, etc.
All of which might be possible with your fingers on your guitar, but I suspect the n-dimensional fluid dynamics might get a little tricky.
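For what it's worth, the 'discrete and sampled' point is easy to show in a few lines of Python - a toy sketch, with an arbitrary signal and sample rate: sample a continuous function at fixed intervals, and the result is plain data that copies without loss and computes directly.

```python
# Toy sketch: "analogue" = a continuous function; "digital" = discrete
# samples of it, which copy losslessly and support direct computation.
import math

def signal(t):
    # A continuous source signal (a one-hertz sine wave).
    return math.sin(2 * math.pi * t)

rate = 8  # samples per second (arbitrary for this sketch)
samples = [signal(n / rate) for n in range(rate)]

copy = list(samples)    # a digital copy is value-for-value identical...
assert copy == samples  # ...no generation loss, unlike analogue media

half_volume = [s * 0.5 for s in samples]  # an algorithmic operation on the data
```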
Where is it taking us? Who knows, and we can't know, but it's obvious that it's going somewhere and we need more people to meet that demand. My workplace is in the IT sphere and there are people who've immigrated from all around the world. Not that that's a bad thing, as we need them and we don't have enough of the various skills in NZ. What I do know is that if we want to a) stop creating an underclass stuck in various combinations of low-end jobs & unemployment, and b) diversify NZ's foreign earnings, then we need a lot of education & training in this area.
It's not the education/lack of credentials that is the problem.
The problem has always been the companies defining (and in some cases inventing) ultra explicit credentials as a way to justify access to offshore labour.
Just give the kids a go, you will be surprised at what they can do.
I know a lot of great coders, and they were all self taught, often in their teenage years.
I agree.
Lots of good coders are self-taught. But there is a self-selection problem there - the majority of coders who are self-taught are also relatively smart people who needed a tool to solve a problem.
However, the excellent coders are those who support their self-directed learning with formal education - maths/stats, comp sci, logic, etc. These are the type who are truly required - modern coding requires parsimony and clarity. Ya just can't (generally) get that from the school of hard knocks.
The real problem is that coding is not, and never will be, a skill which can be learnt by everyone. The best you may get is 50% of the workforce being able to write a hello world in VBA. That isn't coding, and we shouldn't be forcing people to specialise in this area if it comes at the cost of increasing their skills in an area where they are better suited.
Totally agree. And I also think that a completely industry led approach will only lead to outcomes that are considered commercially viable at any particular time - it doesn't expand the field per se.
Also - and this is probably my pet rant - the 'hello world' approach suits a particular type of 'if this then that' thinking, but doesn't foster thinking where logical relations have to be circumnavigated to achieve a desired result. I see this all the time at the intersection between coders and designers - both are nonplussed at the apparent lack of knowledge of the other.
So my approach is not to attempt to identify the coders from the designers in high school in the first place (an institution which has been shaped by a much older factory model), but to enable people's own aspirations at the outset so that they self-motivate their own direction. To increase the efficiency of this otherwise very long process, we need a tertiary approach that has a greater gearing towards microcredentials! The ability to 'stack' them together makes a lot of sense, as a 'degree' is still considered desirable by many. One of the consequences of this is to have learners who don't just come from a single domain - divergence, not convergence.
I'm not so sure it's hard to identify who will be good at coding. What kind of hobbies does this person have? How is their ability with abstract logic? Do they enjoy learning (constant learning being a prerequisite of working in software)?
I started recently at a software company, having no previous experience in the industry; they told me, after hiring me, that it was largely because I have a philosophy degree, and if you can do logic you can do coding. The company is full of people who basically learned on the job. They are constantly upskilling and shifting roles, because it is impossible for any one person to understand the entirety of a complex software package.
I really appreciate that I wasn't written off because I'm 'too old to learn new things' (at 35) and don't have experience or qualifications in the field. They know it's really not that hard to figure out if someone is smart or not (the right kind of smart) if you interview them and ask the right questions. Beyond a certain 'raw ability' with abstract logic, willingness to learn is more important than anything else. I don't doubt that there are thousands of people out there in other industries who would be very capable at coding if given the chance.
You are an example of my previous comment, then.
However, the excellent coders are those who support their self directed learning with formal education - maths/stats, comp sci, logic, etc.
There are so many fundamentals associated with these skills that if you don't acquire them, your learning ability is significantly slowed and/or limited.
I'm not so sure it's hard to identify who will be good at coding. What kind of hobbies does this person have? How is their ability with abstract logic? Do they enjoy learning (constant learning being a prerequisite of working in software)?
Sure, you can do this with school kids. However, development of such skills doesn't proliferate until after the adolescent years (in the majority of people).
So, although you might identify a handful of worthy candidates in the 99th percentile of high school students, that is nowhere near enough to satisfy demand.
I agree that chasing high school kids isn't the best idea. Added to which, you don't necessarily want all your best and brightest in IT - we need them in science, medicine, politics(!) etc. My point really is that there are a lot of adults out there who would be good at these jobs, but the tertiary education system is still geared towards 18-year-olds rather than 'grown-ups'. It's very difficult, financially, for adults to leave a low-skilled job to train in a new field - it entails a big cut in income and a substantial new debt - and most employers aren't good at taking a punt on someone who seems to have the fundamental qualities they need and giving them time to learn the skills.
Picking anyone is a lottery for employers, and they used to take people straight out of school for many years ("Back in my day I never went to university! I used to walk 4 miles to and from school each day. Uphill! In both directions!"). It's also not hard to identify strong coders out of school.
Technology is merely the means of increasing energy efficiencies. That process tails off as it hits thermodynamic limits. All those people who 'earn' via digital manipulation expect to be able to swap the digits they are 'paid' for processed parts of the planet (stuff on their supermarket shelves, in the car-yards and hardware stores).
The correlation between this suggestion (which could only come from a flat-earther - an economist, in other words) and the amount of processed planet is? Zero.
So this, if it were an academic offering, should be rejected. Only in economics could it be accepted. For a physics-type, it represents urging people on an upper Titanic deck to write more apps so you can pre-order your deckchairs. Clearly, this discipline cannot anticipate, verify, or monitor sinkings.
A fail.
Ironically, not that you knew, those flat-earther economists also have something called a 'Luddite fallacy'.
Generalised a little, it is very pertinent to the PDK position on technology.
Ya gotta admit - for as much as you berate them, those pesky economists are always one step ahead of you.
Maybe formal education isn't so bad, after all.
PDK, in a Newtonian mechanical world I would agree with you, but not in a quantum one. Technology is part of emerging phenomena, both in that how things are done matters (the inclusion of some phenomena is simultaneously the exclusion of others) and through the intra-actions between actants. (Thermodynamic limits are not limits per se, although they are boundary-setting conditions.) My point being that continually emerging phenomena are messy, not mathematically clean, and technology is not neutral or inert. The desire to verify phenomena, which derives from rationalism, unfortunately does not account for daily experience, which is neither repeatable nor, frequently, verifiable.
So in an exchange system where such events are tallied in a one-to-one correspondence, such as the one you describe, I agree that such accounting is flawed. But quadruple bottom line accounting has moved on from this understanding of economics (also a technology), although implementation is another matter entirely.
So where I do agree with you is in relation to the idea that digitality is somehow without an environmental impact. All digital operations generate heat, require physical space somewhere for hardware (even cloud servers live somewhere) and require tangible resources for electrons to flow through (copper, rare earths, plastic etc). Not to mention Cartesian mathematics itself (another technology, which requires an origin point zero to operate).
Technology can indeed increase efficiencies (when we rearrange the deckchairs on the Titanic, can we do so in the most energy-efficient manner?). I'm just disagreeing with your logic.
And for a perfect example of a non-Newtonian energy source, try nuclear... which blows a Fat Man-sized hole through simplistic munny=energy entrails divinations. France gets 80% of its excited electrons that way, yet still somehow manages to achieve an enviously high quality of existence. Banlieues and the 700-odd 'Zones urbaines sensibles' excepted, but of course.
At the end of the day, a carrot, a potato chip and a Ford Ranger are physical, are processed planetary parts, and are the end-point (after all the churn) for money. They - as Soddy pointed out a long time ago - are the true wealth. Money is just debt - an expectation, currently in abeyance, of wealth to be obtained in the future.
Until we account for stock depletion (including sink capacity, like the atmosphere absorbing CO2 or the ocean absorbing plastic garbage) we are accounting falsely. Until perhaps 1970, we could have accounted properly. Now, I suggest, there is too much debt and too little planet. You can divert the problem into the virtual world and buy a little time, but every computer nerd still needs to be fed. With real food.
I agree. And virtuality - whether that's machines not being able to recognise the difference between tangible hardware and software versions of other machines, or notions of graphical virtuality - still generates heat, requires copper or radio waves for transmission, requires energy expenditure etc etc. So that's not buying time, it's just shifting it around somewhere out of sight.
I did see a documentary on TV some years back which demonstrated that those on the autism spectrum are likely to be the best at computer programming, which I would guess means 'coding'.
I'm sure I haven't got autism because I am certainly 'digitally challenged'. Good god, is that the time!