Anthropology professor discusses what we can learn from Sweden to protect our personal data
“CU data cyberattack” was the subject line that appeared in thousands of university-affiliated inboxes on Feb. 9, 2021. On that date, former CU President Mark Kennedy reported that individual records of students and employees may have been compromised.
CU was one victim of this cyberattack on Accellion, a file-sharing service that CU used, which also hit organizations like Shell and the University of California. And it was not the only such attack in the past year.
As these attacks grow in frequency and complexity, the U.S. government is rushing to address the problem. A CU Boulder professor, though, suggests we look to Sweden for possible solutions while also maintaining a crucial element: the public’s trust.
Alison Cool, an assistant professor in anthropology, studies data law and practice, particularly in Sweden, and she says her research has potentially broader implications, namely ways to regulate data that balance the interests of citizens, consumers, corporations and governments.
Finding a definitive compromise for all stakeholders is not straightforward, she says.
“Sweden had to grapple early on with the question of ‘how do we use data to create a better society while also protecting individuals from the potential harms that can come from the availability of this information?’” Cool said.
She has been conducting research in Sweden since 2005. Initially, she planned to study women who gave birth to twins after in vitro fertilization (IVF), which led her to the Swedish Twin Registry, the largest twin registry in the world. Her research focus, however, shifted as she saw the large-scale national collection of personal data and what seemed like the public’s trust in scientific research and the state.
“In Sweden, people have historically had a lot of trust in science, and they want to trust the state,” Cool said. “The narrative is that it doesn’t feel threatening to have that information collected, because you feel very strongly that it’s going to benefit society, but the people who are telling me this are the scientists.”
However, this history of trust was called into question in 2010 when LifeGene, a biobank that planned to collect genomic data from half a million people, was launched in Sweden. The biobank was at the center of a public controversy about biomedical ethics and value creation in the age of big data.
“When they started LifeGene, it put things in a new light because it was no longer this older model of citizens participating in scientific research and it helps the state,” Cool continued. “When you have a public-private database with financially valuable data, it changes that kind of relationship of trust.”
Two published journal articles stemmed from Cool’s research in Sweden. In them, she examined the relationship between data use and data regulation, and the benefits and drawbacks of each.
In capitalism, data distribution is a large and profitable new market. Data has been described as the oil of this new market, better defined as the digital economy, and, as with oil, it has political ramifications.
“There is such a vast power imbalance between the parties (governments and corporations versus individuals) that are coming together,” said Janet Ruppert, a graduate student studying information science at CU Boulder. Ruppert was in Cool’s graduate seminar, and Cool is a member of her PhD committee.
“People think about elite programmers that are doing all of this stuff, but I would really point more to the company as an institution and the legal and economic sides. The business and legal side of the company are really the ones who are pushing for exploitation because that’s how you make it under capitalism.”
The data economy encourages exploitative practices, even though there are well-intentioned data users. And when harm does occur, illegal conduct is difficult to pinpoint and prosecute.
Developing clear guidelines or regulations is difficult, though, because of the legal and cultural complexity of regulating a global data economy. As a result, “there is a widespread sense of uncertainty about data law,” Cool said, discussing the European Union’s General Data Protection Regulation (GDPR).
The GDPR regulates organizations that collect data related to people in the European Union. The law took effect on May 25, 2018, and organizations that violate its guidelines face financial penalties. “The regulation itself is large, far-reaching and fairly light on specifics,” as one overview of the law states.
Cool interviewed scientists, data managers, legal scholars, lawyers, ethicists and activists in Sweden to understand their opinions on the GDPR. She found that scientists and data managers considered the law “incomprehensible,” meaning that they felt it lacked clarity and specificity. For example, scientists worried that if they were to manipulate the data outside the law’s parameters unknowingly, they could find themselves in court.
Legal scholars and lawyers believed that the law’s loosely defined stipulations were not the result of poor legislation, but rather the fact that technology advances more quickly than legislation does.
Creating laws that consider ethics, data use and social norms, as Cool explains, is difficult, given the varied interests of stakeholders. However, Cool argues that the GDPR is “the most comprehensive, most people would say the best, privacy law that we’ve ever had.”
“I would say that Europe really set the standards for data privacy for the world and is really the world leader in how to best regulate personal data in the best interest of individuals,” Cool said.
For companies, an ideal world would consist of completely open, free data use and distribution, but that world is not feasible when individuals value personal security. Balancing the goals of institutions with the values of individuals continues to be the point of contention.
“Everyone has to find a place in between those two extremes where research can get done, but in a way that is secure,” Cool said.
Ruppert, Cool and other CU data experts collaborated on a resource for CU employees and students affected by the data breach. The document recommends best practices for data security and privacy tools.
“I think it’s important that people understand that transparency (of how data is used) is a very low bar. If tech companies really want to be transparent, maybe they can actually start writing the terms of service and privacy policies to include relevant, specific information that would meet a reasonable standard of transparency,” Ruppert said.
Developing strict and clear data regulations that allow flexibility on both sides of the spectrum is a tall order. Yet, as Cool concluded, “Pragmatic guidelines that make sense to people who work with data might do a lot more to protect our personal data than a law that promises to change the internet, but can’t explain how.”