
Center for Internet and Society - Privacy



Privacy has become one of the defining issues of the Information Age. CIS has received national recognition for its interdisciplinary, multi-angle examination of privacy, particularly as it relates to emerging technology.



 



Fake news busters

Fri, 15 Sep 2017 22:44:27 +0000

"As an outside technology adviser to Hillary Clinton’s failed 2016 U.S. presidential campaign, Scott has first-hand knowledge of how digital falsehoods can infiltrate — and, he would add, sway — an election.

Scott, who also served as a Clinton aide on technology when she was U.S. secretary of state, is part of a growing brigade of policymakers, national intelligence agencies and fact-checking agencies working to ensure the same thing does not happen in Germany as voters head to the ballot boxes.

“It’s my own personal regret that we didn’t understand the significance of this during the U.S. election,” said the soft-spoken American who has teamed up with Stiftung Neue Verantwortung, a digital think tank in Berlin, to combat potential online misinformation in the run up to the September 24 vote."

Date published: 
September 15, 2017



Privacy’s Blueprint: The Battle to Control the Design of New Technologies

Fri, 15 Sep 2017 19:58:10 +0000

September 19, 2017 7:00 pm

The Tech/Law Colloquium speaker for September 19, 2017 will be Woodrow Hartzog, a professor of law and computer science at Northeastern University, where he teaches privacy and data protection law, policy, and ethics. His recent work focuses on the complex problems that arise when personal information is collected by powerful new technologies, stored, and disclosed online.

Talk: Privacy’s Blueprint: The Battle to Control the Design of New Technologies

Abstract: Every day, Internet users interact with technologies designed to undermine their privacy. Social media apps, surveillance technologies, and the Internet of things are all built in ways that make it hard to guard personal information. And the law says this is okay because it is mainly up to users to protect themselves—even when the odds are deliberately stacked against them.

We should resist this state of affairs. Instead, the law should require software and hardware makers to respect privacy in the design of their products. Current legal doctrine treats technology as though it is value-neutral: only the user decides whether it functions for good or ill. But this is not so. Popular digital tools are designed to expose people and manipulate users into disclosing personal information.

Against the often self-serving optimism of Silicon Valley and the inertia of tech evangelism, privacy gains will come from better rules for products, not users. The current model of regulating use fosters exploitation. We must develop the theoretical underpinnings of a new kind of privacy law responsive to the way people actually perceive and use digital technologies. The law can demand encryption. It can prohibit malicious interfaces that deceive users and leave them vulnerable. It can require safeguards against abuses of biometric surveillance. It can, in short, make the technology itself worthy of our trust.

For more information visit: http://infosci.cornell.edu/techlaw-colloquium/woodrow-hartzog-september-...




The Undue Influence of Surveillance Technology Companies on Policing

Fri, 15 Sep 2017 19:53:51 +0000

September 12, 2017 7:00 pm

On Tuesday, September 12, the Tech/Law Colloquium welcomes Elizabeth Joh, a professor of law at the University of California-Davis. Professor Joh has written widely about policing, technology, and surveillance. Her scholarship has appeared in the Stanford Law Review, the California Law Review, the Northwestern University Law Review, the Harvard Law Review Forum, and the University of Pennsylvania Law Review Online. She has also provided commentary for the Los Angeles Times, Slate, and the New York Times.

Talk: The Undue Influence of Surveillance Technology Companies on Policing

Abstract: Conventional wisdom assumes that the police are in control of their investigative tools. But with surveillance technologies, this is not always the case. Increasingly, police departments are consumers of surveillance technologies that are created, sold, and controlled by private companies. These surveillance technology companies exercise an undue influence over the police today in ways that aren’t widely acknowledged, but that have enormous consequences for civil liberties and police oversight. Three seemingly unrelated examples -- stingray cellphone surveillance, body cameras, and big data software -- demonstrate varieties of this undue influence. These companies act out of private self-interest, but their decisions have considerable public impact. The harms of this private influence include the distortion of Fourth Amendment law, the undermining of accountability by design, and the erosion of transparency norms. This Essay demonstrates the increasing degree to which surveillance technology vendors can guide, shape, and limit policing in ways that are not widely recognized. Any vision of increased police accountability today cannot be complete without consideration of the role surveillance technology companies play.

For more information visit: http://infosci.cornell.edu/techlaw-colloquium/elizabeth-joh-september-12...




Illinois begins pilot project to put birth certificates on digital ledger technology

Thu, 14 Sep 2017 07:00:00 +0000

"Stanford Law School Center for Internet and Society Director of Privacy Albert Gidari agreed that blockchain is secure and Illinois’ move allows individuals to have better control over their government-issued ID.

“Who you are depends on who the government says you are,” Gidari said, “and this really changes that dynamic and gives you data portability.”

Gidari joked to not think of this as a dystopian future novel the likes of “A Brave New World,” but rather to think of it as a “better brave new world.”"
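
As a rough, purely illustrative sketch of the "digital ledger" idea behind the Illinois pilot (the record fields, hashing scheme, and verification step below are hypothetical assumptions, not details of the actual project), a registrar could anchor only a cryptographic hash of a birth record on the ledger, letting the holder later prove the document's integrity without the ledger ever storing the personal data itself:

import hashlib
import json

# Hypothetical birth record held off-ledger by the individual.
record = {
    "name": "Jane Doe",
    "date_of_birth": "1990-01-01",
    "place_of_birth": "Springfield, IL",
    "registrar_id": "IL-0001",
}

# Canonicalize and hash; only this digest would be written to the ledger.
ledger_entry = hashlib.sha256(
    json.dumps(record, sort_keys=True).encode("utf-8")
).hexdigest()

def verify(candidate: dict, on_ledger_hash: str) -> bool:
    # Anyone holding the full record can recompute the hash and compare it
    # against the on-ledger entry to confirm the record was not altered.
    recomputed = hashlib.sha256(
        json.dumps(candidate, sort_keys=True).encode("utf-8")
    ).hexdigest()
    return recomputed == on_ledger_hash

print(verify(record, ledger_entry))  # True for the untampered record

This is the sense in which the individual, rather than the issuing office, ends up holding a portable, verifiable copy of the credential.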

Date published: 
September 14, 2017



Can Cops Force You to Unlock Your Phone With Your Face?

Wed, 13 Sep 2017 07:00:00 +0000

"“When you put your fingerprint on the phone, you’re actually communicating something,” Albert Gidari, the director of privacy at Stanford University’s Center for Internet and Society, told me last year. “You’re saying, ‘Hi, it’s me. Please open up.’” That communication should be protected under the Fifth Amendment, just like a password, he said—and the same would hold for any other way of unlocking your phone using physical characteristics, including facial recognition."

Date published: 
September 13, 2017



This chatbot could help you sue Equifax

Tue, 12 Sep 2017 07:00:00 +0000

"But a word of caution comes by way of Ryan Calo, a privacy expert and law professor at the University of Washington. 

"You have to trust that the person who designed the bot knows what they're doing," Calo said. "A small error could invalidate whoever's using it, right?""

Date published: 
September 12, 2017



Prof Shows How Your Internet Activity Is Being Watched

Mon, 11 Sep 2017 07:00:00 +0000

"According to Narayanan, even without trackers, it is safe to conclude that anonymity does not exist on the internet. Narayanan’s group previously demonstrated that almost all browsing history can be de-anonymized and traced to specific users. According to Narayanan, Edward Snowden’s leaks on the U.S. government’s surveillance programs revealed that cookies — small pieces of information stored by a website on a user’s computer — can be used to tie that history back to specific people."

Date published: 
September 11, 2017



It’s about to get tougher for cops, border agents to get at your iPhone’s data

Mon, 11 Sep 2017 07:00:00 +0000

"Riana Pfefferkorn, a legal fellow at Stanford University, agreed. She also said that this new software design choice may not specifically be about enhancing Fifth Amendment protections and trying to frustrate police efforts.

"If a passcode is now required in order to sync a device with a new machine, that has practical utility for security purposes above and beyond the context of a device seized by law enforcement (for example if a thief, or an abusive partner, gets hold of the device while it's unlocked)," she e-mailed Ars. "Bear in mind also that if law enforcement's goal is to obtain a backup of the device, they can already serve legal process on Apple to get Apple to turn over the last device backup that was uploaded to iCloud (if that feature is turned on). That backup may not be super recent, however (as in the Syed Farook case).""

Date published: 
September 11, 2017



Communicating Cyber Intelligence to Non-Technical Customers

Thu, 07 Sep 2017 07:00:00 +0000

This article examines the challenges facing cyber intelligence analysts who must explain threat information and analysis to non-technical consumers, such as executives or law enforcement. It explains why these challenges are common but often more pronounced in state and local government contexts. Finally, it proposes a conceptual framework for thinking about the tradeoffs such analysts face, examines similar challenges in other policy areas, and offers strategies for communicating threat information effectively under various constraints.

Article available at http://www.tandfonline.com/doi/full/10.1080/08850607.2017.1297120

Publication Type: 
Academic Writing
Publication Date: 
September 7, 2017



The evolving laws and rules around privacy, data security, and robots

Wed, 06 Sep 2017 07:00:00 +0000

"Every day we use countless digital devices and web services to shop, track our fitness, chat with friends, play games, check-in at stores and restaurants, you name it. While these activities are becoming increasingly essential in our digital society, they also can put our personal information at risk, says professor Woodrow Hartzog, whose research focuses on privacy, data protection, robotics, and automated technologies.

Hartzog, who joined Northeastern’s faculty this fall with joint appointments in the School of Law and the College of Computer and Information Science, focuses on the complex problems that arise when personal information is collected by powerful new technologies and then stored and disclosed online. “Information is power, and when other people collect it they have power over us and that leaves us vulnerable,” he says."

Date published: 
September 6, 2017