At PEAT, we are passionate about ensuring that workers across the country have access to the digital tools they need to succeed. Digital tools at work include everything from strong broadband connections to artificial intelligence (AI) that reduces bias in the hiring process. […]
Some employers report using surveillance tools because they fear that remote work lowers productivity. However, research consistently shows the opposite. The International Workplace Group found that 85% of businesses reported that offering remote options made them more productive—with 67% estimating that it improved productivity by at least one-fifth.
Concern about workplace surveillance is growing at both the federal and state levels of government.
Employers should exercise caution when using automated surveillance tools. They should develop best practices that limit surveillance through intentional, centralized governance procedures that prioritize the inclusion of people with disabilities and other underrepresented groups. Beyond legal compliance concerns, automated workplace surveillance can foster harmful organizational cultures and other undesirable outcomes.
People with disabilities and chronic health conditions are less likely to be employed due to systemic barriers, including workplace discrimination. They are also particularly vulnerable to the harms of automated surveillance, which can exacerbate those barriers. When it comes to automated decision-making, research shows that data science predictions are often wrong for outlier groups such as people with disabilities.
Employers are adopting new surveillance technologies to monitor and rank how employees move and behave on the job. This trend may create barriers for workers with disabilities and other underrepresented groups, undermining Diversity, Equity, Inclusion, and Accessibility (DEIA) goals. Surveillance technologies can foster negative workplace cultures and even expose the employers who use them to legal liability.