Bill English last year famously called his policy line “incremental radicalism”. Last week he did some “incremental incrementalism”. How will that run on “social media” in September?
“Radical” his superannuation qualifying age fix isn’t. No change till 2037 and then to 67 by 2040, though the 10-year residency requirement does go quickly out to 20 years.
Even politically, that is an incremental shift from John Key’s flat “no”.
Key came from a trade where if a swizz goes bad a year or two later that is someone else’s bother because “I’ll be gone, you’ll be gone”. Likewise for English and Steven Joyce if intergenerational tensions warm up in the mid-2020s.
English insists he takes a long view, for example in his fiscal caution targeting a low government net-debt level, his accent on building infrastructure (mainly roads and broadband) and his social “investment” programme aiming for long-term fiscal savings.
He would add housing, as he told me angrily last October — though from the outside that policy’s formation has looked more reactive than proactive.
He has not (yet?) matched Labour’s attempt to understand and develop policies to respond to the fast-changing nature of “work” (though officials have looked into it). What “work” is, how it is paid for or how people otherwise get incomes will play big in the 2020s.
But English was quick on to the potential that mining metadata offers for more accurate social assistance.
Hence the demand by the Ministry of Social Development (MSD) that not-for-profits doing good work with some money from MSD must collect and hand over personal data of those they help.
How will those not-for-profits and “customers” (a favourite English term) fare if — when — MSD’s data custodianship fails?
So what? Isn’t privacy dead? (As some Wellington Collegians reminded us last week when their rape posts went viral.)
That is the thesis of a must-read new book by Andreas Weigend. "Data for the People" takes us deep into the collection, use and misuse of personal data and its positive potential, encapsulated in its subtitle: "How to make our post-privacy economy work for you".
Weigend is an American from East Germany. Soviet agents imprisoned his father for six years because he could speak English. Weigend found he had a security file, too.
He writes from inside the data world, as a former chief scientist at Amazon and a physics professor at the prestigious Stanford University, with his own Social Data Lab consultancy which works with, among others, dating websites.
Weigend knows a lot about “machine learning”: programming computers to analyse data for use by marketers, retailers, other companies — and politicians.
This is called “refining”: turning crude data into marketing fuel by processing the burgeoning information that can be, and is, collected about you.
Privacy, Weigend says, was a transient phase between the everyone-knew-everyone village and the metadata 2010s.
No matter how wary you are of "social media" and how little you voluntarily divulge on the likes of Facebook and in emails, an alarming amount is swiped from your smartphone, your interactions with companies, your clicks on web pages, security cameras and many other sources.
“We are not going to stop creating data,” Weigend says. “The vast majority of us are not going to stop sharing data either.”
And he sees many potential positives, including early detection of latent health conditions.
But to ensure positives outweigh negatives, “efforts to ensure that data are for the people must focus not on trying to control our data at their … source but on seeing and exerting influence over the controls of the data refineries” — outfits like Facebook, Amazon, retailers and governments.
He wants rules requiring those "refineries" to be transparent about what they collect and what they do with it; rights to access one's data, see a data safety audit, a privacy efficiency rating and a "return-on-data" score; and rights to amend, blur, experiment with and "port" one's own data.
That will need governments to cooperate — and the right people elected.
Facebook briefed journalists last week on the fast-proliferating channels it provides, on its own and with regular media, for politicians to campaign and voters to question them. Journalists, too, can use it in multiplying ways. Some do.
Campaigners, and companies helping them as Cambridge Analytica did for United States Republicans last year (though far less than it chest-thumpingly claimed), can mine psychometric data to target messages at users of Facebook and other platforms, matched to their preferences and personalities. Already those platforms rank "news" on this basis, which helps embed biases.
Add in "bots": automated social media accounts used to autonomously spread messaging ("astroturfing"), amplify allies' messages and "roadblock" opponents', including with "fake news", news-like items with a wisp of truth but twisted and embroidered.
Enjoy your election. Superannuation? What’s that?