An awkward moment is becoming increasingly common in meetings, whether in person or online. Someone enters the room, switches on their laptop, and without saying a word, activates their AI bot to record and transcribe the conversation.
The rest of the attendees remain silent, unsure whether to voice their concerns or pretend nothing is happening. They wonder if they have enough authority to speak up or if it might negatively affect them later.
Welcome to the era of normalized corporate surveillance.
As is often the case, technology has outpaced the rules. AI assistants can record, transcribe, and analyze tone. They can also identify who speaks the most, make inferences about individuals, and even suggest responses in real time. Yet there are no established social norms for any of it.
Is it rude to activate a bot without warning? Where does that data go? Who else will see it? What will they do with it?
This is the classic paradox of technological disruption. The tool exists, it works, and it promises efficiency, but social norms struggle to keep up. It's like the arrival of mobile phones, when it took years to settle whether it was acceptable to answer a call in a movie theater or talk on the phone in an elevator. With devices like Ray-Ban Meta smart glasses, we face a similar situation.
However, the stakes are higher this time. The bot doesn't just record. It interprets, analyzes, and stores information. It captures not only what is said but also how it's said, when someone hesitates, and whom people seem to agree or disagree with. All of that data ends up not only on the devices used but also on the servers of companies that already hold too much information about us.
The solution won’t come from the technology itself but from us. We need to establish a clear etiquette quickly:
- Warn others before recording or activating the bot.
- Specify the purpose of the information gathered.
- Ask if anyone feels uncomfortable with this practice.
- Normalize company policies that restrict the use of AI bots in certain situations.
If we don’t take these steps, we risk normalizing the notion that any conversation can be transformed into data to be processed without our consent. That’s not a meeting anyone wants to attend.