Privacy consultants say brands they work with are already in breach of the old rules, let alone the new ones.
And don’t fall into the trap of believing it’s simply a case of updating the privacy policy: you also need to make sure that business processes reflect the policy, and that in turn could lead to system changes and additional costs.
Here’s what’s changing: if your software uses personal information to make significant decisions about customers, prospects, or consumers without human intervention, you need to disclose that in your privacy policy. That’s the easy bit.
What’s more complicated and fraught is that you also need to ensure your business processes reflect what you say in your policy, which may require your organisation to change the way its systems work – and how you collaborate with partners, whose own lawyers will probably want a quick word, or maybe a very long one with lots of warrants and sub-clauses.
Pull on the thread and the whole damn rug starts to unravel.
The privacy law grants your customer a right to action – basically, a valid basis for pursuing legal proceedings over a specific set of facts or circumstances that may have caused them harm or injury – something governments have been loath to agree to in the past, say privacy experts.
Automated decision-making (ADM) was one of 30-plus areas of privacy covered in the long-running consultation the government initiated before the legislative update, where it said its mind was made up and no further discussion was required. Many of the other items on the list have been pushed back to next year, but ADM made it through – which speaks to how it is viewed as a priority.
While definitions of the “significant harm” a decision might visit upon a customer can already be discerned from Australian case law, there is, for now, no detailed definition in the legislation of what counts as an automated decision.
On the current reading of the legislation introduced to Parliament last week, it could cover everything from the huge multimillion-dollar real-time decisioning ecosystems that CommBank, NAB, and ANZ are building on the back of Pega’s software, through to something as simple as a JavaScript tag on a web page that triggers a decision.
Industry leaders Mi3 spoke with offered a wide range of views on current practices that could qualify as causing significant harm, many of them already common in business applications today: accepting or denying credit card applications, using loyalty programs to offer differential experiences (“Welcome to the President’s lounge, Madam”), accepting or rejecting a job application based on AI analysis of a CV, or surge pricing based on the user’s behavioural profile as defined by the data you hold – including how panicky they get when the battery on their phone turns from reassuring green to OMG-red.
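To make the “simple script tag” end of the spectrum concrete, here is a minimal, purely illustrative sketch of the kind of lightweight automated decision a page script might make. The function name, signals, and thresholds are all hypothetical – they are not drawn from any real product – but they echo the examples above: surge pricing keyed to a low battery, and differential treatment for loyalty-tier members.

```javascript
// Hypothetical sketch only: an automated pricing decision made with no
// human in the loop, of the sort a small script tag could trigger.
// The signals and thresholds below are invented for illustration.
function decidePrice(basePrice, profile) {
  let multiplier = 1.0;

  // Behavioural signal from the article: a low phone battery as a
  // proxy for urgency – the "OMG-red" surge.
  if (profile.batteryLevel !== undefined && profile.batteryLevel < 0.2) {
    multiplier += 0.15;
  }

  // Differential experience for top-tier loyalty members.
  if (profile.loyaltyTier === "president") {
    multiplier -= 0.05;
  }

  // Round to cents.
  return Math.round(basePrice * multiplier * 100) / 100;
}

// No human reviews these outcomes before they reach the customer –
// which is exactly the kind of processing the new disclosure
// obligation appears to target.
console.log(decidePrice(100, { batteryLevel: 0.1 }));        // surge applied
console.log(decidePrice(100, { loyaltyTier: "president" })); // discount applied
```

The point is not the arithmetic but the shape: a decision with real financial consequences for the customer, made entirely in code, would on the current reading need to be disclosed in the privacy policy.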