Microplastics are making their way into our food, drinking water and even the air. Robot managers are optimizing the autonomy out of many jobs and leaving employees stressed out and injured. Many job seekers are finding themselves vetted in part by artificial intelligence.
***
When microplastics enter the food chain.
Globally, we ingest an average of five grams of plastic every week—the equivalent of a credit card, according to a study commissioned by the World Wildlife Fund (WWF). This contamination comes from “microplastics” (particles smaller than five millimeters) that are making their way into our food, drinking water and even the air.
Around the world, people ingest an average of around 2,000 microplastic particles a week, the study reveals. These tiny particles can originate from a variety of sources, including synthetic clothing fibers, microbeads found in some toothpastes, or larger pieces of plastic that gradually break down into smaller fragments after they are discarded and exposed to the elements. The particles make their way into rivers and oceans, where they can be eaten by fish and other marine animals, ending up as part of the food chain.
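The two WWF figures above can be related with a quick back-of-the-envelope calculation. Note that the per-particle average below is our own arithmetic, not a claim from the study, and actual particle masses vary enormously:

```python
# Both weekly figures are from the WWF-commissioned study cited above.
grams_per_week = 5.0        # roughly one credit card's worth of plastic
particles_per_week = 2000   # estimated microplastic particles ingested

# Implied average mass per particle (illustrative only; real particles
# range from near-invisible fibers to 5 mm fragments).
avg_mass_mg = grams_per_week / particles_per_week * 1000
print(f"Average mass per particle: {avg_mass_mg} mg")

# Extrapolated yearly totals
print(f"Per year: {grams_per_week * 52:.0f} g, {particles_per_week * 52} particles")
```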
The full impact of plastic pollution remains unclear.
“There is very large uncertainty about the harms that plastics do,” said Professor Richard Lampitt of the UK’s National Oceanography Centre, who was not involved in the research. “Plastic is not a particularly harmful material; however, there is the potential that it does significant harm.”
Source: “You could be swallowing a credit card’s weight in plastic every week”
***
Robots make grueling taskmasters.
The last CM Risk Alert discussed the risks associated with working side-by-side with robots in warehouses. But what happens when the robot is the boss and not just a co-worker?
A new report from The Verge describes humans being managed by intelligent machines in warehouses, call centers and other sectors. As these systems seek to optimize the workday, they are making workers’ lives more stressful, grueling and dangerous.
Robots are detecting “inefficiencies” in warehouses and forcing employees to work non-stop to meet quotas, leading to stress injuries. They’re managing software developers, monitoring their clicks and scrolls and docking their pay if they work too slowly. They’re listening to call center workers, telling them what to say and how to say it, and monitoring their level of empathy. Employees who work from home are being forced to download software that monitors their productivity in real time, tracking keystrokes, mouse clicks and running applications, prompting them to stay on task if it detects they’ve become distracted or idle, and even taking screenshots and webcam photos.
These automated systems can detect inefficiencies that a human manager never would—downtime after a call, lingering at the coffee machine between tasks, a new route that could help get a few more packages delivered in a day. But as employees lose their autonomy to optimization, their jobs are becoming more intense, stressful and dangerous.
Source: “How Hard Will the Robots Make Us Work?”
***
Job screening via AI.
New job seekers could be facing an additional barrier to employment next time they go on an interview: They may find themselves vetted in part by artificial intelligence.
Many businesses looking to fill internships and entry-level positions have turned to outside companies such as HireVue, a hiring intelligence platform that uses video interview software and AI algorithms to perform pre-hire assessments.
With HireVue, businesses can pose predetermined questions—often recorded by a hiring manager—that candidates answer on camera through a laptop or smartphone. Those videos are then pored over by algorithms analyzing details such as word usage, facial expressions and the tonality of the job applicant’s voice, trying to determine how likely a candidate is to possess a specific attribute a client is looking for in a certain job, such as empathy, tenacity, willingness to learn or aptitude for teamwork.
This gives the company offering the job an almost standardized-test-score view of the candidate, showing what they’re good at and how they stand relative to other candidates. A report can also be generated to give the candidate feedback, but it’s up to the employer to share that.
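The scoring-and-ranking idea described above can be sketched in a few lines. The trait names, weights and features below are illustrative assumptions, not HireVue's actual model, which would use trained machine-learning classifiers rather than hand-written weights:

```python
def trait_scores(features: dict) -> dict:
    """Map raw interview features (word usage, tone, expression) to 0-100
    trait scores. Purely illustrative weighted sums stand in for the
    proprietary models a real platform would use."""
    return {
        "empathy": min(100, 10 * features.get("empathetic_words", 0)
                           + 50 * features.get("warm_tone", 0)),
        "teamwork": min(100, 20 * features.get("we_statements", 0)),
    }

def percentile_rank(score: float, cohort: list) -> float:
    """The 'how they stand relative to other candidates' step: the share
    of the cohort scoring below this candidate."""
    below = sum(1 for s in cohort if s < score)
    return 100 * below / len(cohort)

# Hypothetical candidate and cohort
scores = trait_scores({"empathetic_words": 4, "warm_tone": 1, "we_statements": 3})
print(scores)
print(percentile_rank(scores["empathy"], [50, 70, 90, 95]))
```

The percentile step is what produces the “almost standardized-test-score view” of a candidate mentioned above.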
There is currently little in the way of regulation or industry standards surrounding disclosure of this technology, so interviewees may not know when (or how) AI is analyzing their interview, raising privacy concerns. Employment practices risks surface as well: research findings reported by a university professor last year revealed bias in emotion-analysis AI, which assigned more negative emotions to Black men’s faces than to white men’s faces.
Source: “There’s a new obstacle to landing a job after college: Getting approved by AI”; “Racial Influence on Automated Perceptions of Emotions”