Every time a new field emerges, the question of ethics comes to the fore.
Ethics as a science is as ancient as our world. And as tedious, too 🙂
But the curious thing about ethical concepts and principles is how they adapt to real-life conditions. Every time a new field or sphere arises on the horizon, ethics, like a many-headed hydra, grows a new head. The same story has repeated itself with weapons, nuclear science, genetics, biochemistry, and so on. Right now that powerful wave is reaching the IT sphere.
IT ethics is a whole complicated field that deserves a separate article. But today we’re going to talk about ethics in data scraping.
Big Data Is the New Black!
Everyone on the market scrapes, buys, and sells consumers’ data, sometimes without their permission to do so (Cambridge Analytica is a vivid example). So it’s not surprising that users want to cut themselves off from social media and other intrusive platforms. The concept of data-scraping ethics appeared a few years ago, on the back of the Big Data trend, and has already received many reviews, both positive and negative.
In today’s world, every company that strives to be successful needs to combine customer data, AI, and marketing strategy all at once.
What Should Every User Do to Protect Themselves from Aggressive Data Scraping?
The Internet is full of instructions on how to properly delete yourself from Facebook (it’s not as easy as it may seem) or how to use the Internet while avoiding Google services and not letting them collect data about you. In an attempt to get rid of their digital shadow, people even move to new places, switch jobs, and use several different phones and laptops. This exciting game of cat and mouse with corporations attracts many enthusiasts around the world who, following Edward Snowden’s example, unscrew the cameras and microphones from their smartphones.
However, given the current scale of gathering information about users and the approaching era of the Internet of Things, when even the laces on your shoes are connected to the network, completely removing yourself from the Eye of Sauron will not work. Only the dead or hermits leave no digital footprint. Now everything depends on how well states and corporations manage to agree among themselves on the treatment of our data.
Why Is It Dangerous to Leave Marks on the Internet?
Dissidents, spies, criminals, and journalists – all of them care about the cleanliness of their digital fingerprints and the anonymity of their actions on the Internet, primarily out of concern for their own safety. For most ordinary users, a digital footprint does not carry such an obvious danger, and yet hundreds of thousands of people around the world are concerned with removing their digital shadow.
- Carelessly leaving a digital footprint compromises the security of even law-abiding users. Thanks to the digital trail, attackers can hack personal accounts and gain access to bank accounts, personal correspondence, and work data. Internet harassment, doxing, stalking – all of these dangerous practices are largely made possible by the victim’s digital shadow.
- A digital shadow forms a reality tunnel around users that can restrict, dull, and radicalize them.
- A digital shadow is the main source of information for corporations like Facebook and Google, which allows them to turn users into an expensive product and to manage the attention of millions of people with incredible accuracy in the interests of third parties.
In the digital world, we are becoming increasingly aware that every action becomes part of the global community we call the Internet – a place where algorithms combine different behavioral patterns into one common picture of humanity. In other words, it is clear to each of us that algorithms learn to recognize, classify, and identify the things that matter to everyone only because we behave in certain ways or prefer certain things.
When making decisions, the first things we consult are search engines, applications, and digital assistants – and they depend heavily on the digital shadow of our past choices. In this sense, we certainly bear global responsibility at the individual level, even if our personal contribution is negligible. To deny this would be to deny a new, transparent reality.
The harder it becomes to stop big data from collecting information about our digital selves, the more obvious it becomes that we are what we do. As more information about our choices becomes available in the ocean of big data, awareness of our responsibility for our contribution should grow.
The power to define ourselves and all of humanity through a collective contribution to big data is in our hands. Of course, this is a very disturbing kind of freedom. But the growing anxiety before this responsibility is not such a bad thing, considering how high the stakes are. As they say, with great power comes great responsibility. And popular bloggers would love to add: with great responsibility comes great existential anxiety.
It’s always better to use and collaborate with companies that are highly aware of data-scraping ethics, and a bright example of that is our dearly beloved LaSoft IT company.
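What does ethical scraping look like in practice? At a minimum, it means honoring a site’s robots.txt rules and its requested crawl delay before fetching anything. Below is a minimal sketch of that etiquette using Python’s standard `urllib.robotparser`; the robots.txt content, the `/private/` path, and the `EthicalBot` user agent are hypothetical examples, not rules from any real site.

```python
from urllib import robotparser

# Hypothetical robots.txt for an imaginary site: it blocks /private/
# for all crawlers and asks for a 2-second pause between requests.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Crawl-delay: 2
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

def may_fetch(url: str, agent: str = "EthicalBot") -> bool:
    """Return True only if robots.txt permits this URL for our user agent."""
    return parser.can_fetch(agent, url)

def polite_delay(agent: str = "EthicalBot") -> float:
    """Honor the site's requested crawl delay; default to 1 second if unset."""
    delay = parser.crawl_delay(agent)
    return float(delay) if delay is not None else 1.0

print(may_fetch("https://example.com/public/page"))   # allowed path -> True
print(may_fetch("https://example.com/private/data"))  # disallowed path -> False
print(polite_delay())                                 # -> 2.0
```

A real crawler would fetch robots.txt from the live site (via `parser.set_url(...)` and `parser.read()`) and call `time.sleep(polite_delay())` between requests, but the checks themselves stay the same.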