Thomas Stoeckle, Neville Hobson and Sam Knowles in conversation on the history and function of PR, business and Big Data (including the ten Vs of Big Data), the SNCR research initiative and survey on fake news, and more.
Amid congressional hearings and FBI investigations in the US about whether and how Russia interfered with the US Presidential Elections, discussions continue about the efficacy and ethics of micro-targeting voters. In our latest and 10th edition of the SmallDataForum podcast, Neville, Sam and I reflect on the outcome of the recent general elections in the UK.
In the latest episode of the #SmallDataForum podcast, Sam Knowles, Thomas Stoeckle and Neville Hobson subject the latest piece of Big News to their usual scrutiny. There’s lively debate about old vs new media, with the right-wing traditional media (particularly the press) apparently little more than an echo chamber of vitriol, contrasted with the fleet-footed use of social channels and influencers to target younger voters – the very voters who were largely ignored in 2016’s two seismic polls and whom traditional media finds harder and harder to reach.
SDF episode 9 discusses the challenges and opportunities of accelerating progress in the areas of machine learning and AI (which for us means augmented, rather than 'just' artificial intelligence). When it comes to permanent change driven by technological advancement, the genie is out of the bottle and it is too late to resist change. We need to get better at understanding it and living with it.
Another key subject is the role of the big social media platforms: are they pure technological intermediaries, or should they take on the responsibilities of publishers? This is a complex and controversial field, with a broad range of opinions. Robert Thomson, CEO of News Corp and former editor of The Times, published a strongly worded editorial in The Times on 10th April in which he claimed that "the two most powerful news publishers in human history have created an ecosystem that is dysfunctional and socially destructive". He calls for Authenticated Authenticity – verified provenance, accuracy, reality – as an asset of increasing value.
Our general, rather broad advice is to question everything. More specifically, seek out sources with a proven track record that you can trust, with authors who link to the things they state, to allow verification.
A topic that’s dominated our conversation in recent episodes of The Small Data Forum podcast is fake news and related issues.
In episode 7, hosted by Thomas Stoeckle in conversation with regulars Neville Hobson and Sam Knowles, we consider world wide web inventor Tim Berners-Lee’s call to action on what he sees as three big challenges for the web:
The second of these in particular – the spread of misinformation – offers another perspective on fake news as part of the so-called post-truth world, and speaks to a key aspect of this contemporary phenomenon: the dissemination of falsehoods and how we can address it. Thomas asks: is it time for a new or updated Cluetrain Manifesto? Cue lively discussion.
In their first SDF podcast of 2017, Neville, Sam and Thomas enjoy a wide-ranging conversation that looks at the issues of loss of trust in institutions, fake news and post-truth from the perspective of machine learning, psychology and personality mapping, political marketing, neurosciences, understanding audiences through better data, and ultimately how to tell more compelling stories with better data.
In episode 5 of the Small Data Forum podcast series hosted by LexisNexis – our Christmas and year-end edition – Neville Hobson, Sam Knowles and I reflect on fake news and its distribution networks, the alleged gaming of Google search rankings, the promise of augmented intelligence, and broad questions of how civil societies deal with emerging and evolving challenges. Do we need more regulation? And who will regulate the regulators?
In the last few weeks, there has been significant attention in mainstream and social media on fake news. Even Pope Francis joined the debate, denouncing slander and defamation through fake news as a sin.
The debate following the surprising outcome of the US Presidential Elections has shifted from highlighting the shortcomings of political polling, to a thorough examination of the circumstances and conditions that led to the election of Donald Trump.
The media researcher and data journalism expert Jonathan Albright tested his hypothesis of a fake news ecosystem by crawling and indexing more than 300 websites known to be associated with fake news. He analysed more than 1.3m URLs and found what he described as a "micro propaganda machine".
Whilst it is possible to identify both 'left-wing' and 'right-wing' media ecosystems, the core of the matter is not politics, but trust in facts and accurate information. It is about the responsibility of stakeholders, including the large platforms such as Google, Facebook and YouTube.
But how will such a responsibility be defined? How will it express itself? Can the claim of algorithmic neutrality still be upheld, or should it be replaced with algorithmic accountability? Is this a question of morality and ethics, or rather one of regulation and the application of the Rule of Law? How do we weigh the risks of policing and censoring the internet against the dangers of a manipulated anarchic swamp of bigotry and hatred?
Listen to Neville, Sam and me debating the various aspects of these questions.
Recent expressions of democratic political will – the UK referendum on EU membership and the US presidential elections – have surprised most observers and commentators. Neither outcome, Brexit nor Trump, was what most of the polling data indicated. This episode of the Small Data Forum asks whether we could and should have seen this coming.
Together with Sam Knowles of Insight Agents and Neville Hobson of IBM Social Consulting, I'm looking at some of the mechanisms at play: the psychology of predictions, the new phenomenon of fake news, echo chamber effects in the way people consume and share information, the way data was analysed and interpreted. Were the wrong questions asked? Are there better, more reliable ways of asking questions in order to get to more robust and reliable answers? And would those answers lead to more accurate predictions of outcomes?
As academics, professional communicators, political commentators and others try to put the recent developments in context, and to understand the future of our information economy and ecology, the Small Data Forum will continue to highlight and explore some of the big and small issues that big and small data both raise and address.
Episode 3 of the Small Data Forum podcast focuses on the growing influence of data analytics in professional and elite sports. With Olympic news still fresh, Neville Hobson, Sam Knowles and Thomas Stoeckle discuss how advances in gathering and analysing data are being used to give athletes a competitive advantage.
Episode 2 of the Small Data Forum podcast focuses on the role and use of data in the campaigns for the EU referendum in the UK. Neville Hobson, Sam Knowles and Thomas Stoeckle discuss the outcome of Brexit and the appearance of 'Regrexit' – the so-called buyer's remorse of some Leave voters – following an immediate change in the argument and presentation of facts, as well as data evidence of negative effects on the currency and share prices.
How do you make Big Data less intimidating, more actionable and thus more valuable? That is the question at the heart of the Small Data Forum, an initiative by LexisNexis Business Insight Solutions to listen, learn, share and educate ourselves and others who grapple with the challenges of the information avalanche.
This inaugural episode includes reflections from industry thought-leaders Neville Hobson, Sam Knowles and host Thomas Stoeckle: