Janja Komljenovic, Valerio De Stefano, Mariya Ivancheva
Research output: Other contribution
| Original language | English |
| --- | --- |
| Type | Seminar video |
| Media of output | YouTube video |
| Publication status | Published - 19 Sept 2023 |
TY - GEN
T1 - AI and academic labour
T2 - seminar in AI in higher education - a CGHE webinar series
AU - Komljenovic, Janja
AU - De Stefano, Valerio
AU - Ivancheva, Mariya
N1 - Transcript:

Janja Komljenovic: Hello everyone, and welcome to this CGHE webinar, the fifth in the series on AI in higher education. This is an exciting and timely series which covers six webinars with excellent speakers on various aspects of AI in the sector. My name is Janja Komljenovic, I am based at Lancaster University, and I am the organiser of this webinar series on AI for CGHE. Today we will be hearing from Valerio De Stefano and Mariya Ivancheva on AI and academic labour, and I am very excited about this webinar. Valerio De Stefano is not a higher education researcher, but his work is incredibly valuable for our field. He is a professor and Canada Research Chair in Innovation, Law and Society at York University in Canada, and I am appreciative, Valerio, of you being here in the morning where you are based. His work focuses on technology and labour; he is a speaker and adviser at various institutions, including the European Parliament and the OECD, and he will speak to us about algorithms and AI and how they affect labour in general. We will then hear from Mariya Ivancheva. Mariya is a senior lecturer at the University of Strathclyde in the UK, based in Glasgow. She is an anthropologist and sociologist of higher education and labour and has had an academic and activist focus for many years. Mariya will act as a discussant to contextualise Valerio's talk for the higher education sector specifically. I am truly grateful to both of you for coming today; I think this is a really remarkable panel.

Before I hand over to our speakers, some brief housekeeping points. This webinar is being recorded and will be posted online on the CGHE website in a few weeks; a transcript of the chat conversation will also be posted. We recommend using the speaker view so you can more clearly see who is talking. To ask a question, use the chat function and write out the question you wish to ask at the end of the presentations; if your question is selected, you will be invited to ask it yourself directly. Please keep yourself muted during the talks, and when you have been asked to speak or ask a question, please unmute yourself, switch on your video and state your name and where you are from. There is no need to have your video on during the talks, but please do so when asking a question if you can. I will now pass over to Valerio De Stefano for today's CGHE webinar.

Valerio De Stefano: Thank you very much, Janja, and thanks also to Mariya for agreeing to be a discussant, and thanks a lot for this invitation. I am very excited to be here with you. I will try to share my slides.

The title of my presentation is "Negotiating the Algorithm", and it will become clear while I speak that when I say negotiating the algorithm I mean that workers should be able to have a voice in the algorithmic systems that increasingly intervene in their work: systems that oversee, direct and in some cases discipline their work. This research is not done in isolation; it is joined with other research that I am conducting, for instance on platform labour, on remote work for office workers, on fundamental labour rights as human rights, and many other things that are normally captured in the category of the future of work. The problem with this category is that in many cases the discourse around the future of work has been covering individual aspects of labour and employment regulation and protection, aspects such as dismissal, precarious and non-standard work, unemployment benefits, but very rarely has the research also focused on the collective aspects. If you are more interested in this line of research, two of my recent publications together with other colleagues and friends are a monograph entitled "Your Boss Is an Algorithm", which I wrote together with Professor Antonio Aloisi, and "A Research Agenda for the Gig Economy and Society", written with other authors and co-editors, which more or less address everything we are going to discuss today.

Another thing this presentation is not going to focus on, even though it is certainly an important issue that may come up in the debate, is how many jobs we are going to lose to technology and automation. This is quite important, but it has been covered extensively in economics scholarship, in sociology, and so on. While the quantitative aspects are important, we think it is extremely important not to leave under the radar the qualitative aspects: how does technology affect the way people work? The people who won't lose their job to automation will anyway be extremely affected by the new technological innovations that are reaching our workplaces. This research is about that topic.

To go into how technology affects our work, we have to take a step back, because when we talk about automation, about technology and work, and in general when people talk about employment law or labour law, they think that labour law is just a protective tool: that labour laws are only there to protect workers. This is quite an oversimplification, and it is something you find all the time in scholarship and policy papers. It is an oversimplification because labour law also creates and enshrines certain powers and prerogatives on the side of employers. Employers have the right to direct their workforce, to instruct them, to tell them what to do; they have the right to monitor what workers are doing; and they have the right to discipline workers who don't comply with those rules. This set of powers is extremely intense in workplaces and is something that doesn't exist, or shouldn't exist, outside the employment relationship between an employee and an employer. The problem is that we normally see labour law just as a protective tool, while instead it is first of all a tool that creates all these powers and then, in some cases, tries to react to the worst abuses that can stem from the exercise of these powers.

Another oversimplification, this time about labour protection, is that it only covers you because you are a weak subject with weak bargaining power, and that it only protects your economic welfare: it enshrines minimum wages, it gives you parental leave, it gives you collective rights to increase your bargaining power. But at the workplace we bring ourselves as persons, and as persons we also carry with us, or should carry with us, all our human rights, and some of the managerial prerogatives of employers can infringe on these human rights: they can discriminate against us, they can breach our privacy, they can stifle our voice. So labour protection is not just about making you a little bit better off in economic negotiations with your employer; it is also there to protect our human rights at work.

The question is whether all this protection, stratified over the years, is keeping pace with what is going on in our workplaces, and the answer may well be that it is not. New technologies are used by employers to exercise all their powers and to augment their powers and prerogatives. They are used in hiring: if you submit your CV to most big companies right now, they will first screen CVs with automated software that selects the few candidates who get to an interview. We can talk about it, but there is a lot of discrimination involved in software selecting people; these systems have been shown to discriminate against women and minorities. During interviews we are subject to surveillance: in many cases a job interview is recorded, and in some cases employers use software that tracks our eye movement, our eyelid movement, our body language, and this software supposedly tells employers something about what we think or what our feelings were during the interview. Even though this is absolute baloney, because there is no software that can tell what you are thinking, these products are traded on the market as able to do that.

Workers are increasingly directed by technology in what they do. Imagine working in an Amazon warehouse: there is a device that tells you what shelf to look for to get the item you have to collect, and this software is also used to track what you are doing. If you are a blue-collar worker, they may follow you with GPS and see how fast you move around the warehouse. If you are a remote worker, or an office worker in general, software is used to assess how productive you are by using proxies that are sometimes very unreliable: how many keystrokes you hit in a certain period of time, or random screenshots of your screen; and if those screenshots are taken in the wrong moment, when you are just relaxing and reading the news to clear your head, then your manager can know about that and take action against you. There are badges people wear that incorporate GPS and follow all their movements in the office, track interactions with other colleagues, and in some cases record not the content of a conversation but the tone of voice, to work out whether people get nervous when they have certain interactions with certain colleagues. All of this monitoring, particularly of office workers and white-collar workers, skyrocketed during the pandemic. Many workers were forced to stay at home, employers were forced to let them work from home, and the reaction from employers to this loss of power in the office was to introduce a bunch of software that helps them monitor workers and proctor what they are doing all of the time. Now that this software has been bought, it is not being rolled back, and even if you are going back to the office, it may well be tracking you even out of the office.

All of this data gathering, all of this technological direction, monitoring and supervision, can also result in disciplining people through technology. Algorithms can fire people, or suggest firing people they regard as unproductive. Big data are collected to work out whether someone is looking for another job, or is looking to make certain parental choices, for instance having babies, by using your browsing and data history. So we have an extremely invasive use of technology that really interferes with the way we work, in ways that, if we don't sit down and analyse them, we tend to take for granted; but they shouldn't be taken for granted at all. The employment relationship has always involved a certain degree of control over the body and mind of people: the employer can tell you what to do and can also tell you what to say. But in general workers still retain a certain degree of mental privacy, and these new technologies want to get around even that minimum level of privacy you would have in your head, either by detecting what you are thinking or, which is even worse, by assuming what you are thinking on the basis of snake-oil technology that purports to tell the employer what you are thinking, what you are doing and what you are going to do next. The use, for instance, of devices that scan your brain activity at work is on the rise as well.

This is important because, as I said, we are not in business as usual anymore. We have always had control and supervision in the workplace, but the kind of control this software allows is, first of all, relentless. You couldn't put a supervisor on the shoulder of every worker in a factory, but you can put software on the shoulders of every one of us that tracks whatever we are doing in every moment of our workday. This software basically suggests to managers what to do, and in some cases managers have their hands tied: if software that has been bought for, imagine, millions of pounds suggests that a certain worker is unproductive and should get the sack, the manager is not going to get in the line of fire over that, even if they think the software is wrong. It is very difficult for managers to overrule a decision that comes through technology. Technology is also perceived as neutral, and this is its power in this framework, because managers and employers will think: this is just about data, it is just maths, and it is the maths that says you are unproductive and should be fired. As I said, and as we can discuss more in depth, these systems are in many cases not only snake oil and unreliable; they are also discriminatory in nature. In many cases they have a certain benchmark of a worker, and that worker is a young white male in his prime age, and everyone who doesn't correspond to the benchmark risks being misjudged by the software in a disproportionate way compared to the benchmark.

The scope and the intensity of control, as I said, are extremely invasive, and what is more, this software doesn't distinguish on employment status: on whether you are a freelancer or an employee, an employee or an independent contractor. Normally employers could only exercise managerial prerogatives to that extent vis-à-vis employees, and the law reacted by providing employees with a degree of protection that self-employed people didn't have, because they were not assumed to need it. Right now, think about Uber drivers, Deliveroo riders, precarious teachers: in many cases they might not be in an employment relationship, but the kind of power that the employer, or principal, or whatever the company is, exerts on them is tantamount to, or even more intense than, the power exerted on traditional employees.

Now, there are certain rules around the world and in Europe, including some that apply to the UK, because even if the UK has left the European Union it is still part of the Council of Europe, so the European Convention on Human Rights applies to UK subjects and also to UK workers, and the Convention has been interpreted to protect a certain degree of privacy at work. There are other instruments that also protect privacy at work; the most important is of course the GDPR, which has a certain degree of protection against fully automated decision-making: when an algorithm decides everything and doesn't need a human to validate its decision, certain safeguards are in place, including at the workplace, even though at the workplace, as I said, there is always a certain degree of power given to the employer by default.

The problem is that privacy is not enough. First of all, the scope of privacy protection doesn't normally include inferences. Data collection is covered, but what the computer infers from this data is not necessarily covered. If the data collection says that I buy a lot of junk food, that data collection may be covered by privacy protection; but if that data is collected anyway and the inference is that, since I buy a lot of junk food, I am going to get sick, that inference, from junk food to "I am going to get sick", is not necessarily protected. So privacy protection is of course important, but it is not enough. On top of that, privacy protections, including the GDPR, may fail to keep pace with the newest ways of invading our privacy: the technology that invades your brain data, your mental and emotional privacy, has been introduced massively after all these standards came into place. For the moment we only have one recommendation from the OECD, which is not binding — it is still significant, but not binding — that calls for the protection of brain data, for the protection of what we can call neurorights; but other standards may have been adopted before this became an issue regulators had to consider.

And this is not just about privacy; it is much more than that. These powers and prerogatives can invade and infringe on a much broader scope of rights than privacy. They can affect your right not to be discriminated against, they can have effects on your occupational health and safety, and they can stifle collective labour rights: the ability to exert a voice, to join a union, to engage in collective action. So it is not just about strengthening privacy protection; it is about reacting, in a more holistic way, to managerial prerogatives and the ways in which technology enhances the powers of employers. To do that you have to use, again, collective rights, such as the right to bargain collectively, to try to put some reins on and rationalise the ways in which technology is being introduced in our workplaces and used to monitor and supervise us. As I said at the beginning, collective rights and labour rights in general are not just there to make you a bit better off economically; they are also there to rationalise, to control, to make sure that employers don't abuse their managerial prerogatives in ways we would consider unacceptable. So they are also there to protect us as human beings at work, to protect our human dignity, and many of the systems I have mentioned, particularly the ones that invade your mental privacy, can definitely infringe upon your human dignity. The collective dimension here is essential. We are not going to get anywhere meaningful just by giving people individual employment rights that may have to be enforced in court, after costly litigation, years after anything bad occurred. We have to be proactive, and only collective rights can effectively be proactive at the workplace.

There are certain instruments — I am not going to bore you with all the legal provisions — but there are already legal instruments that can be used to better control technology through collective rights to voice. Information and consultation standards are some of them. There are conventions from the International Labour Organization that promote collective bargaining on anything that has to do with terms and conditions of employment, and technology definitely affects those, so these instruments can also be used to better protect people at work. Certain collective rights are also recognised in the GDPR. And when I say that collective bargaining must have a place in this, I mean that collective agreements, both at the sectoral level, which is not super common in the UK but more common in continental Europe, and at the shop or company level, should regulate the kind of technology that is introduced, particularly monitoring technology: how data are collected, the transparency of that data protection, the storage and processing of these data; ban the most abusive forms of data collection, including data on your mental and emotional privacy; and ensure that humans maintain a certain degree of oversight over the operation of algorithms. If algorithms are black boxes, if algorithms cannot be explained and the result of what comes out of them cannot be explained, as many programmers report, then the case should be that you cannot use something you can't explain over people.

Very importantly, and I want to be clear about this, collective rights don't stifle innovation. The places in which collective rights are strong are not less innovative than places in which they aren't: Germany and France are not less innovative than places in which all these rights are not respected and complied with. I am going to skip over a bunch of legal instruments. The EU is introducing a new AI Act that also has to do with the workplace, and unfortunately it doesn't provide for any meaningful collective right for workers. It will provide certain safeguards against algorithmic surveillance and uses of AI that interfere with workers' rights, but here we see an example of an instrument prepared by people who don't know how workplaces work, and there is no space for collective voice in the functioning of AI at work, at least in the draft we have seen so far; and even if the Parliament is trying to negotiate changes on that, we are still very far from the kind of control that collective voice should have, which is needed to ensure that technology doesn't overstep.

The Platform Work Directive, again, is another proposed EU instrument. It will only concern platform workers. Platform workers will have a right to an explanation of how the algorithms work and, very importantly, their trade unions and union representatives will be able to have a voice: to be informed about algorithmic decision-making, the decisions that are taken and the systems that are in place; and certain decisions will be subject to review, including with the presence of a human supervisor. Very importantly, the Platform Work Directive excludes — bans — the collection of data that have to do with the emotional and psychological state of workers. The problem is that this only applies to platform workers, workers who work for Uber, Deliveroo or Amazon Mechanical Turk; it doesn't apply to other workplaces, so it is very limited in scope. Certain collective rights don't apply to people classified as freelancers, so many platform workers will still be classified as freelancers and won't benefit from the full set of protection. Also, importantly, in many places in continental Europe, when you want to install a video camera in the workplace to do certain levels of monitoring, you cannot do so unless you have an agreement with the unions, or the public bodies, or the labour inspectorate. This is valid for cameras, and yet the principle is somehow not valid for much more invasive forms of surveillance that can come through algorithmic management. In the Platform Work Directive, workers and unions only have a right to be consulted and informed, not a right to negotiate, not a right in some cases to veto the introduction of certain technologies, which is a standard far less protective than the standards we have on video cameras, even though the invasion of privacy and of all the other rights I have mentioned that can stem from new technologies is much more significant than what can come out of a simple video camera.

So we see that there is always a certain level of techno-determinism involved when regulators interface with technology: they assume that everything that is technically possible should also be allowed, and this is something we should arguably object to. Not everything that technology allows should necessarily be allowed by regulators. In the past we have banned, for instance, truth machines at work: you cannot use a truth machine to vet workers. There is no reason why we shouldn't ban certain devices that allow an invasion of privacy tantamount to that. We need more attention to rights that go beyond privacy — occupational health and safety, the stress you get from being monitored and tracked all the time — and the question of non-discrimination is also extremely important. In general, I will close by saying that we need a right to be human at the workplace, a right to be free agents at workplaces, without technology that proctors, monitors and controls us all the time as if we were robots. With that I stop, and I thank you very much for your attention.

Janja Komljenovic: Thank you so much, Valerio, for this excellent talk and a lot of food for thought. I now hand over to Mariya Ivancheva for a discussion.

Mariya Ivancheva: Thanks a lot, Janja. Do you hear me? Just a couple of remarks before I go on to discuss Valerio's excellent talk. I perceive my task today as that of a sociologist of higher education and academic labour, but I am also an academic on strike today: UCU Scotland, the Scottish universities, are on strike, and I decided to come here more in my activist role, to speak about certain issues within the academic profession that I have become quite aware of, which need collective action and union bargaining, and which, I am afraid, academic unions currently do not pay enough attention to, even if there is already enough work about them. I think Valerio's emphasis on regulation is very important, but what is also interesting for me in the talk is the emphasis on non-coercive measures and discipline, and I am going to speak about certain issues in the academic profession that we perhaps sometimes take for granted. I will take a higher education perspective on labour, which is not very frequent — the two things don't usually come together in discussions — and I will speak about the different functions of education: teaching, research and service. I will refer to some concepts Valerio mentioned, but also try to conceptualise them through a social-scientific lens. So I congratulate the organisers, because bringing these two topics together is not very frequent, and also because most recently so much attention has been directed towards students' use, or overuse, of AI in plagiarism, and it has also pushed worries about institutions and tech companies using AI to monitor, survey, profile and exclude our students. Today's talk is not about that, so let's speak about academic labour.

First, what is it that AI helps with in academic labour, when you don't think of students only? It is machine learning, deep learning and natural language processing that are augmenting our ability, first in administrative terms, to process recruitment, admission, retention, plagiarism and performance data and to automate decision-making. In terms of teaching, it is promoted for adaptive content, recommendations, feedback and assessments; course syllabi, lectures and assignments can all be written. In terms of research, it can write text and code, process large datasets or help us do that, identify patterns, build models and recommend content. The dark side, as Valerio's talk has amply shown, is the question of discipline and control: not only over employees' duties, but also certain changes to working methods, places of work and so forth that dispossess workers, infringing upon human rights and dignity and upon the benefits that allow for the social reproduction of workers. I think social reproduction is quite important to take into account here: it is not just abstract rights, it is the right to reproduce ourselves. Then there is the alleged neutrality and lack of bias, which runs up against the reality of AI replicating actual institutional racism, sexism, ableism and classism, and there are the property rights over a product, which are difficult to locate when that product is split into a chain of tasks — I will come back to that a bit later.

Let's think about the academic profession more broadly. Academic labour is still labour, and what happens, importantly, is that through labour workers sell to capital units of time that stop being theirs, and capital uses new forms of technology to intensify performance per unit of time. Capital is interested in labour power, not in the person, and the constant availability of ever more intensive, profitable, low- or no-maintenance labour power is what it really needs and what it enhances through technology, and AI is now on the rise there. But I would take a more conservative position, perhaps, than Valerio: I think it is temporary, because this is a new technology that replaces a large biomass of labour power, but it will itself be replaced the moment productivity standardises and becomes a norm, and the profit generated through it becomes a norm, and then capital will flee and look for other investments. I think it is helpful to think back to the question of the subsumption of labour — formal subsumption — in the sense of how, for instance, weavers' activity was first monetised within the labour relation and then transformed, and by now it is machines that do this work. Capital is in constant search of the next surplus-expanding technology, and technology is always framed as democratising access and expertise; but we are also living in a market with artificially maintained differential value, with concentration in certain positions of prestige and power and social dumping in others, and there is a very complex polarisation going on, which is also expressed in the question of the automation of jobs. So it is very important to ask what it is in a profession like academia, which was supposed to be high-skilled and not really automatable, that is now really changing, and from where within academia we are to be having this conversation.

The other thing that is changing, in the era of advanced capitalism, is what Richard Sennett called the corrosion of character, of the linear narrative of life progression through social, professional and institutional fields. We are speaking, on the one hand, of the disembedding of higher education from a set of social relations into an abstract commodity or asset, and, on the other, of the disembodiment of the function it performs and of its temporal and spatial dimensions through digitalisation. Even if AI has only now entered, we have to think about a longer symbolic process of dispossession, and it is interesting to ask what it is in academic work that made us think it wouldn't go there, and what is susceptible to this process. We have years spent in upskilling and schooling, which profiles us within a narrow disciplinary mode that also disciplines us within a very hierarchical structure of power and prestige between individuals, institutions, national systems and so forth. It is very interesting that, when Valerio was speaking about a machine screening your CV, I was thinking: why is it a problem that the machine screens my CV, but not a problem if it is a white privileged male, which is what usually still happens on academic juries? So there is something in this new technology that replicates a lot of old technologies; it just comes in a new guise.

Secondly, I think the reason we are scared about what is happening is that we are thinking about this deskilling while we are already living in a very stratified academic labour market where deskilling is a fact. We are living in a place where a big male professor-manager usually manages the work of postdocs and teaching buyouts, and each of them is ever more split into subtasks and sub-professional functions that are then easy to curate online and replace, as we have seen through the entry of edtech into higher education, through MOOCs, through online degrees and so forth, in which the academic labour force has been outsourced. A process like this is already under way in higher education, and AI adds fuel to the fire. Then there is socialisation into specific patterns — the temporal and spatial functioning of the academic cycle. Academics are very much socialised into certain cycles, certain types of events, certain rituals of connecting with each other, and this is corroded by part-time, fixed-term and zero-hour contracts, and corroded further by deskilling and by turning our profession into tasks that can more and more easily be curated and replaced. This is especially the case now with teaching, and we will see whether AI brings new deskilling in research as well. In that sense we have to think about where AI contributes by further polarising the academic labour force. Technology is supposed to unburden academics from teaching so they can focus on research, because that is what is profitable now; but is that really what it is doing? We are speaking more about the annihilation of time and place, of on-demand content accessible anytime from everywhere, the unfettered flow and transmission of knowledge as pure information content, and the disembedding and deskilling of teaching. It is also important to see that we are speaking of a new labour market where AI dictates both how we are skilled and how we skill our students: there is a dumbing down of content for certain mass audiences, whereas privileged audiences will still be exposed mostly to the integral asset of the white male professor. And then there is the unbundling that is happening through services, micro-credentials, nano-degrees and bite-size content: thinking that we are reducible to tasks, and that the profession should be reduced to marketable content, one that functions to smoothly perpetuate the functioning of capital rather than something that creates a shock effect, some kind of friction that produces knowledge.

To finish: it is interesting not to shed a tear for the old privileged academic who is now drifting apart, but to ask whether there is something in this new scare about AI that we are maybe forgetting, and whether there is more to think about regarding what to regulate within the academic profession and what kinds of capital we have. I think it is important to think also about where the next place is that capital will fly to, and how we can regulate not just the relationship between company and users or workers, but regulate and harness capital, because it will carry on its flight for more surplus extraction, ever more advanced technologies and ever less investment in the human workforce. How do we do that? Thank you.

Janja Komljenovic: Thanks very much, Mariya, for the many things you have opened up, much appreciated. Valerio, would you like to respond, or shall we open the floor for Q&A?

Valerio De Stefano: I think we can open the floor, but I would like to thank Mariya, of course, for the discussion.

Janja Komljenovic: Right, Lee, go ahead.

Lee: It seems from what you have been saying that legislation really isn't working, or if it is, it is lagging well behind developments in AI. So what is the solution for controlling AI software, to stop it being used to contravene human rights and rights to privacy? What is a worker going to have to do?

Valerio De Stefano: Well, outside of legislation, the only thing you can do is to try to mobilise and to include in your bargaining strategies things that have to do with the kind of control that technology allows: resist the introduction of software that can lead to increased monitoring and surveillance, and remember that every new piece of software can incorporate monitoring functions we don't think of. Imagine your Outlook: I use Outlook, and sometimes the status is green, then it turns yellow after some time, and at some point it turns red. All of that can be used to say whether you were present at work or not. All these things we don't think about can be used to collect data, and that data can then be used to discipline you. So the first thing that needs to be done, whenever new technologies are proposed and introduced, is to ask what they are actually going to be used for and to try to exclude certain uses; and the only way to do that is, again, through collective voice, expressing collective concern, and including this in collective bargaining and collective action strategies. That is the only other way I see, if legislation is lagging behind as it is — and yes, that is only going to work where workers are in powerful groups; but barring legislation and collective action, I cannot think of anything else that works.

Janja Komljenovic: Right, thank you very much. There is actually a comment from Wayne. Wayne, if you are okay to come in and say something about that, I think that would be really relevant and interesting.

Wayne: Thank you. Just building on the conversation about legislation, I want to let people know about the work going on at the Council of Europe. It is a long process, and it is behind the developments in AI for sure, but at least it is attempting to do some things. It is a recognition that the whole purpose of these tools in education contexts is to affect the developing cognition of students; and we live in a world where we happily regulate medicines before we use them with people, so if these tools are designed to affect human brains, then we should be thinking about regulation for that as well, focusing not just on the students but on the academics themselves. The meeting is next week; if it goes through, it will probably then be a two-year process based on a lot of evidence gathering. I completely agree with and accept Valerio's perception here, and I agree that regulation on its own is inadequate, but I still believe we need the regulation, and even if it is late and inadequate, we should still be pursuing it. That is what we are doing at the Council of Europe.

Valerio De Stefano: Can I respond to that for a second, because I want to be clear: I think regulation is very useful, and we need much more regulation of these things, including in the form of legislation, but also international legislation and standards, so what you are going to do at the Council of Europe is going to be extremely important. My problem with regulation is a problem with the current regulation, which doesn't necessarily take enough account of the risks that stem from these systems. As you say, as you do with medicine, you want to know what is in the medicine and be sure you are not going to kill people; well, this is not an assumption regulators make when technology is involved. As I said at the end of my presentation, the approach is just techno-deterministic: this new technology is going to be fantastic, it is going to allow us to do a lot of things; and then, without thinking of all the consequences I have outlined in the presentation and that Mariya built upon, we have regulators who say, "let's just allow it, because you don't want to stifle innovation". Well, if innovation is about reading eyelids, maybe we do want to stifle that kind of innovation. So yes, we absolutely need legislation on this, and the legislation we have is not enough.

Wayne: Can I just come back very quickly? I completely agree, Valerio. One of the problems we have in the education space is, as you were saying in your presentation, this belief that existing legislation addresses all the issues. The one we get all the time is around data, so the GDPR; but the point we are having to make, and the challenge we are having, is getting policymakers to recognise that yes, data and the way it is regulated are fundamentally important, but that is not the whole story: it is about things like agency, empowerment, choice of pedagogy, a huge range of different things, and we need to be thinking about those as well. So thank you very much for your talk, I really found it useful.

Janja Komljenovic: Great, thank you so much. What I was thinking as well is that, since the higher education sector is more and more digitalised, we academics are using various platforms and virtual learning environments for teaching, even for on-campus students, emails and so on. As we produce content and use these platforms, we obviously also produce user data. What we have noticed emerging in the sector are new products pitched at higher education managers and leadership, saying: you can use this to manage your academic labour and provide measures for new KPIs, so you have AI to complement your work. Virtual learning environments already included services that would automatically produce the structure of your modules, or some sort of basic structure, but at the same time these algorithms are now seen as something that can also be used to quite actively manage academic labour and expectations. It is something we see emerging also for people who have full employment in the sector, not only for precarious, outsourced academic labour, and I think that is quite an important development to watch. I am not sure if you have anything to say about that, Valerio and Mariya.

Valerio De Stefano: Yes, I think this technology will be used by management and supervisors — in the academic world I think of deans and everyone above them — and it is going to really stratify the rest of the academic population. We will have a situation in which there is no distinction whether you are a lecturer, a professor, a freelancer or not. Of course, the more precarious you are, the more you will be affected, but the technology applies to everyone, and it tends to over-monitor, to create stress, to create occupational health and safety risks and discrimination risks, and to stifle the collective voice of everyone. So this is going to become a very powerful managerial tool in the hands of the administration of any kind of higher education institution, and this is why it is particularly important that the entire academic workforce perceives it as something they have to care about.

Janja Komljenovic: Thanks. Mariya, did you want to come in?

Mariya Ivancheva: Yes. I was thinking, again, that there is definitely a peril in that, but I think we have to ask who is benefiting from it, and how we harness it — how we try to regulate in a way that those who especially profit from public higher education cannot profit that much; because the reason managers are interested in this kind of management is that there is a kind of socialism for the rich going on, with a lot of redistribution of public resources through higher education into private hands. So the question is where we stop that, and I don't think just regulating the use of certain technology in the hands of management is where this is going to stop. While I think it is important for our individual rights, I also think we are going to be pitted against each other in this struggle, and once again I totally agree with Valerio that it is a matter of collective bargaining, but we still have to tease out what exactly we are bargaining for, and I don't think we are there yet with the current legislation.

Janja Komljenovic: Thank you very much; there is also a good comment by Penny on that issue. Since we are two minutes to the hour, I want to finish the webinar on time. Thank you both so much for coming, we really appreciate it. For everyone who attended, the recording will be published fairly soon on our usual YouTube channel, and we will finish this series on AI in higher education on Thursday with a talk from Christine O. on ChatGPT in higher education. Thank you very much.

Valerio De Stefano: Thank you, thanks a lot.

Mariya Ivancheva: Thank you.
PY - 2023/9/19
Y1 - 2023/9/19
N2 - Video length: 58:34
AB - Video length: 58:34
KW - AI
KW - labour
KW - education
KW - academic identities
KW - intellectual property
KW - artificial intelligence
KW - webinar
M3 - Other contribution
ER -