More and more data is generated every day, and our ability to analyse it is being pushed further than ever before. With that growing analytical power comes growing risk for the data being collected: data that revealed nothing earlier can suddenly start revealing a huge amount of information.
Most of the time, the data being collected is provided by people who expect to gain something in return. They expect the data to be used responsibly – not shared, not misused – and appropriately destroyed once its intended purpose has been served.
This is essentially a model based on trust, where the data provider trusts the data consumer. It is of utmost importance for the consumer to ensure that this trust is not broken. Anyone developing a product based on data should treat that data as if it were their own. The question we need to answer is: why would the provider share the data in the first place?
The answer is critical for the entire industry: the provider shares data because they trust the consumer. So how do you establish that trust? Merely saying “You can trust me” is not going to establish anything.
There has to be a well-defined agreement between the provider and the consumer covering how the data will be collected, stored, accessed, used (including shared), and destroyed.
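One way to make such an agreement concrete is to express it as a machine-readable policy rather than only as legal prose. The sketch below is purely illustrative: the hypothetical DataAgreement structure and its field names (purposes, shared_with, retention_days, and so on) are assumptions for this example, not any existing standard.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical, minimal sketch of a machine-readable data-use agreement.
# Field names and structure are illustrative assumptions, not a standard.
@dataclass
class DataAgreement:
    provider: str                # who supplies the data
    consumer: str                # who collects and processes it
    data_categories: list[str]   # what is collected, e.g. "medical history"
    purposes: list[str]          # why it may be used
    shared_with: list[str]       # third parties it may be shared with
    retention_days: int          # how long it may be kept
    agreed_on: date              # when consent was given

    def allows(self, purpose: str, recipient: str | None = None) -> bool:
        """Check whether a given use (and optional sharing) is covered."""
        if purpose not in self.purposes:
            return False
        if recipient is not None and recipient not in self.shared_with:
            return False
        return True


agreement = DataAgreement(
    provider="patient",
    consumer="hospital",
    data_categories=["contact details", "medical history"],
    purposes=["treatment", "billing"],
    shared_with=["insurance provider"],
    retention_days=365,
    agreed_on=date.today(),
)

print(agreement.allows("treatment"))              # True: covered by consent
print(agreement.allows("marketing", "pharmacy"))  # False: never agreed to
```

The point of such a structure is that every later use of the data can be checked against what was actually agreed, instead of relying on the provider having read and remembered a wall of legal text.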
How many times have we clicked the “I Agree” button just to move forward? How many times do we actually read such agreements? Even when someone does read them, are they worded so that the information is easily understood by the reader? Most of these agreements are ‘binary’: the consumer specifies certain conditions for using the data and expects the provider to either agree or disagree. If you disagree, you cannot use the consumer’s services. In a huge number of scenarios, the user does not even have a real option to disagree. For example, when I go to the hospital for the treatment of a patient, do I have any option to say ‘no’ to any of the forms I am given to sign? Those forms could include conditions like:
- They can share my data with insurance providers, who may then start calling me
- They can share my data with pharmacies
- They can force me to buy medicines from a particular vendor
The hospital can use my data for any such purpose. I have only two options: agree and get the patient treated, or forgo treatment for my near and dear ones.
Does this establish trust? Consider ride-sharing apps: they need to collect location information about drivers and passengers to ensure the service is being delivered.
This makes sense while the app is in use. However, if the consent agreement allows location data to be collected regardless of whether the driver or rider is actually using the app, a user may be passively providing their location without being actively aware of it. In such cases, the application may be inferring that passenger’s interest in various goods or services based on the locations they travel to, even when they are not using the app.
Given that location data may move through mapping APIs, or be used by the app provider in numerous ways, a user has little insight into the real-time use of their data and the different parties with whom it may be shared. For users of the ride-sharing app, this may cause concern that their location data is being used to profile their time spent outside the app – information that could be significant if, for example, an algorithm determines that a driver who has installed the app also drives for a competing ride-sharing provider. Without clear consent agreements, it becomes hard for the user to interpret where their data moves and how it is used, and this can erode their trust that their best interests are being served.
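To make the point about passive collection concrete, here is a small, hypothetical sketch of how an app could gate location collection on both the consented scope and whether the app is actually in active use. The function and scope names are assumptions for illustration only, not any real ride-sharing API.

```python
# Hypothetical sketch: collect location only when the user has consented
# to that scope AND the app is actively being used for a ride.
# All names here are illustrative assumptions, not a real API.

CONSENTED_SCOPES = {"during_ride"}   # what the user actually agreed to


def may_collect_location(app_in_active_use: bool, requested_scope: str) -> bool:
    """Return True only if collection matches the consented scope."""
    if requested_scope not in CONSENTED_SCOPES:
        return False                 # e.g. "background" was never consented to
    return app_in_active_use         # "during_ride" applies only while in use


print(may_collect_location(app_in_active_use=True, requested_scope="during_ride"))  # True
print(may_collect_location(app_in_active_use=False, requested_scope="background"))  # False
```

A check like this is the technical counterpart of a clear consent agreement: the code refuses to collect anything the user did not explicitly agree to, instead of collecting first and interpreting the agreement later.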