The language used for app tracking, privacy and data usage settings causes consumer confusion.
Privacy and security features that aim to give consumers more control over how smartphone apps share their data are widely misunderstood, according to new research from the University of Bath’s School of Management.
Forty-three per cent of phone users in the study were confused or unclear about what app tracking or privacy settings mean. People commonly mistook the purpose of tracking, believing it was intrinsic to the app’s function or would provide a better user experience.
Companies use app tracking to deliver targeted advertising to smartphone users.
When iPhone users first open an app, a pop-up asks whether they want to allow the app company to track their activity across other apps. The prompt, introduced by Apple’s App Tracking Transparency framework in April 2021, offers two choices: ‘Ask App Not to Track’ or ‘Allow’. Android users must manage tracking consent via their phone settings.
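For readers curious how this works on the developer side, the sketch below (assuming iOS 14.5 or later, with a hypothetical helper function) shows how an app triggers the standard prompt using Apple’s AppTrackingTransparency framework. The buttons and main wording are supplied by the system, not the app.

```swift
import AppTrackingTransparency
import AdSupport

// Minimal sketch: requesting the system tracking prompt on iOS 14.5+.
// The prompt's main text and its two buttons ("Allow" / "Ask App Not to Track")
// are fixed by Apple; only the usage-description line comes from the app itself.
func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // User tapped "Allow": the advertising identifier (IDFA) is available
            // and activity can be linked across apps for targeted advertising.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking allowed, IDFA: \(idfa)")
        case .denied, .restricted, .notDetermined:
            // User tapped "Ask App Not to Track" (or tracking is otherwise unavailable):
            // the IDFA is zeroed out and cross-app tracking is not permitted.
            print("Tracking not permitted")
        @unknown default:
            print("Unhandled authorization status")
        }
    }
}
```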
If people opt out of tracking, their use of apps and websites on their device can no longer be traced by the company, and the data cannot be used for targeted advertising or shared with data brokers.
The most common privacy-related misapprehension (24 per cent) was that tracking refers to sharing the physical location of the device – rather than tracing the use of apps and websites. People thought they needed to accept tracking for food delivery and collection services, such as Deliveroo, or for health and fitness apps, because they believed their location was integral to the functioning of the app.
While just over half of participants (51 per cent) said they were concerned about privacy or security – including security of their data after it had been collected – analysis showed no association between their concern for privacy in their daily life and a lower rate of tracking acceptance.
“We asked people about their privacy concerns and expected to see people who are concerned about protecting their privacy allowing fewer apps to track their data, but this wasn’t the case,” said Hannah Hutton, postgraduate researcher from the University of Bath’s School of Management.
“There were significant misunderstandings about what app tracking means. People commonly believed they needed to allow tracking for the app to function correctly.
“Some of the confusion is likely to be due to lack of clarity in wording chosen by companies in the tracking prompts, which are easy to misinterpret. For example, when ASOS said ‘We’ll use your data to give you a more personalised ASOS experience and to make our app even more amazing’ it’s probably no surprise that people thought they were opting for additional functionality rather than just more relevant adverts.”
Although the main text of the prompt for app tracking consent is standardised, app developers can include a sentence explaining why they are requesting tracking permission, and this can open the door to false or misleading information, whether intentionally or inadvertently.
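As an illustration, the developer-written line shown beneath the standard prompt comes from the NSUserTrackingUsageDescription key in the app’s Info.plist; the wording below is a hypothetical example, not taken from any real app.

```xml
<!-- Info.plist: the one piece of the tracking prompt a developer controls.
     The string below is a hypothetical example of a purpose description. -->
<key>NSUserTrackingUsageDescription</key>
<string>We'll use your data to show you more relevant adverts in the app.</string>
```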
Other misconceptions included believing that allowing tracking for health apps (such as period tracking apps) would mean their private health data being shared, or that denying tracking would remove adverts from the app.
The study, Exploring User Motivations Behind iOS App Tracking Transparency Decisions, is published in the proceedings of the ACM CHI Conference on Human Factors in Computing Systems and was presented at the CHI 2023 conference in Hamburg, Germany (23-28 April).
It is thought to be the first academic analysis of the decisions people make when faced with tracking requests.
The researchers collected data on the tracking decisions of 312 study participants (aged 18 to 75) and analysed reasons for allowing or rejecting tracking across a range of apps, including social media, shopping, health, and food delivery.
David Ellis, a Professor of Behavioural Science and co-author, added: “This research further exposes how most consumers are not aware of how their digital data is being used. Every day, millions of us share information with tech companies, and while some of this data is essential for these services to function correctly, other data allows them to generate money from advertising revenue. For example, Meta predicted that they would lose $10 billion from people rejecting tracking.”
“While people are now familiar with the benefits of having PIN numbers and facial recognition to protect our devices, more work needs to be done so people can make transparent decisions about what other data is used for in the digital age.”
Source: University of Bath