EFF is suing the FBI under the Freedom of Information Act to learn how the bureau recruits Best Buy Geek Squad employees to report on illegal content found on devices they take in. The interest originates from a federal case in California, where Best Buy confirmed that members of its Geek Squad facility in Kentucky received compensation for reporting customers who possessed child pornography on their devices. If the FBI is recruiting private-industry employees to spy on personal computers, EFF argues, it constitutes an unlawful government search in violation of the Fourth Amendment: relying on private vendors is a means of accessing hidden data without obtaining a warrant, circumventing traditional privacy protections. Best Buy has stated that the employees’ decision to accept payment goes against its policies. However, when you drop a device off at Geek Squad, you sign a document acknowledging that Best Buy will turn over devices containing child pornography to the FBI. Employees are not permitted to search for such material; they must instead come across it while performing the service the customer requested. Court documents from the California case show suspiciously close ties between the FBI and the Geek Squad, referring to the employees as “sources.” It will be interesting to see what documents EFF’s FOIA suit uncovers about this cozy relationship between the FBI and private industry.
Source: SF Chronicle
Last Thursday, the FCC voted 2-1 to formally propose eliminating net neutrality rules. Chairman Ajit Pai applauded the vote for putting “technologists and engineers” at “the center of the online world,” rather than “lawyers and accountants.” [This, despite overwhelming support for net neutrality among technologists and engineers.] [And this, despite the very probable involvement of lawyers and accountants in deciding which websites receive faster or slower access.] The approved proposal rescinds the classification of broadband internet as a Title II telecommunications service. Commissioner Clyburn offered a dissenting statement, referring to the proposal as Destroying Internet Freedom and arguing that it contains a “hollow theory of trickle-down internet economics.” She rejects the claim that removing regulations will in any way improve service to consumers. Though the proposal has been approved, it now enters a three-month comment period before the FCC votes on whether to adopt the rule. That means there is still time to make specific comments on the proposal and defend net neutrality.
This observation from Scientific American clued me in to a part of privacy I hadn’t yet considered, possibly a final frontier for surveillance. We typically consider that, well, maybe I can’t always have privacy in public, privacy on my phone, privacy at home, or even privacy of my body, but at least I will always have privacy of my mind: no one can know what I am thinking.
This may no longer be true. Neuroimaging devices have apparently become so advanced that hidden intentions can be revealed, visual experiences can be decoded, and dreams can be read with ever-increasing accuracy. Brain-computer interfaces (BCIs) may also be used to modulate brain activity. In 2008, a woman in India was sentenced to life in prison because a brain scan suggested she knew something about a crime. Neurotechnology is also making its way into marketing research, for companies like Google and Verizon, to detect what people really think about their ads. This technology raises a whole set of ethical questions about how much control we should have over the life of our minds, and brings with it new terms:
Cognitive liberty means that individuals can make “free and competent decisions regarding their use of neurotechnology.”
Mental privacy protects against third-party intrusion into, and unauthorized collection of, brain data.
Mental integrity, protected by the EU Charter of Fundamental Rights, protects against “illicit and harmful manipulations of people’s neural activity.”
Psychological continuity protects the continuity of people’s mental life from unconsented external alteration.
Some good definitions to keep in mind.
A release of documents by a group calling itself the Shadow Brokers demonstrates that the NSA breached the SWIFT money-transfer system by way of service providers in Latin America and the Middle East. SWIFT bills itself as a worldwide provider of secure financial messaging. Vulnerabilities in outdated Windows servers and Cisco firewalls apparently contributed to the NSA-directed breaches. In one instance, the NSA gained access to nine servers at a Dubai-based contractor for SWIFT and was able to query the transaction data.
MIT Technology Review reports that a team at the University of Beijing has developed an Emotional Chatting Machine, a chatbot that alters its responses according to the emotion it detects from you, drawing on a field of study called sentiment analysis. To train the chatbot, the team used a dataset of 23,000 sentences collected from the Chinese blogging site Weibo, manually annotating the emotion of each sentence. A deep-learning algorithm then classifies the emotional content of new text. For the chatbot’s responses, a conversation generator produces content, which is then checked for the correct emotional valence. The motivation? People are more likely to react positively to empathetic conversation partners who match their emotions. The team claims this chatbot would be useful in call centers, for instance when consumers contact companies online with service requests. But…should your computer know how you are feeling? Who will be collecting your emotional status? Is it manipulative to reinforce emotion via technology?
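The classify-then-generate-then-check flow described above can be sketched in miniature. To be clear, this is purely illustrative and not the team’s actual method: a toy keyword lexicon stands in for their deep-learning classifier, canned replies stand in for the conversation generator, and every name here is hypothetical.

```python
# Toy sketch of an emotion-aware chatbot pipeline: classify the user's
# emotion, generate a reply, then verify the reply's emotional valence.
# The keyword lexicon is a stand-in for a trained classifier.

EMOTION_LEXICON = {
    "happy": {"great", "love", "wonderful"},
    "sad": {"terrible", "miss", "lonely"},
    "angry": {"hate", "furious", "unfair"},
}

RESPONSES = {
    "happy": "That's wonderful to hear!",
    "sad": "I'm sorry, that sounds hard.",
    "angry": "That does sound frustrating.",
    "neutral": "Tell me more.",
}

def classify_emotion(sentence: str) -> str:
    """Label a sentence with the emotion whose keywords it matches most."""
    words = set(sentence.lower().split())
    scores = {emo: len(words & kws) for emo, kws in EMOTION_LEXICON.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "neutral"

def generate_reply(sentence: str) -> str:
    """Produce a reply, then re-check that it carries the intended emotion."""
    emotion = classify_emotion(sentence)
    reply = RESPONSES[emotion]
    # Valence check: re-classify our own reply; fall back to a neutral
    # response if it no longer matches the intended emotion.
    if emotion != "neutral" and classify_emotion(reply) not in (emotion, "neutral"):
        reply = RESPONSES["neutral"]
    return reply
```

In the real system a neural classifier and a sequence-to-sequence generator fill these roles, but the control flow, detect the user’s emotion and condition the reply on it, is the same idea.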
The latest release of CIA documents from WikiLeaks describes Grasshopper, software used to combine malware building blocks into tools that infect Windows operating systems for intelligence operations. Included among the released CIA manuals are steps for evading common antivirus programs, like Symantec and Microsoft Security Essentials, so that subjects of surveillance do not know their systems are infected. Some of the technology is borrowed from a notorious malware package called Carberp, suspected to have been developed by Russian organized crime for bank fraud.
Source: Ars Technica
For the first time, Microsoft has released a list of all the diagnostic data collected in Windows 10. The list is available here, and was updated today. This is likely a response to the privacy concerns that have surrounded Windows 10 since its initial release, along with rumors of unnecessary data collection such as keylogging (recording the keystrokes you make on your computer). France, for instance, last year ordered Microsoft to stop tracking Windows 10 users in ways such as recording which apps are installed and how long users spend on each. The list released today is very lengthy, so as computer experts dig into it, there should be more news to come on what specifically is being tracked. Microsoft reports that it has reduced the amount of data collected at its “Basic” diagnostic level to comply with the European Union’s General Data Protection Regulation.
Reported on The Verge