Day 4: 22 Mar 2018
Social media execs face panel on fake news
Facebook admits it should have told users earlier about breach of policy
The Straits Times, 23 Mar 2018
Facebook executives were grilled yesterday by members of the parliamentary Select Committee on deliberate online falsehoods.
Law and Home Affairs Minister K. Shanmugam took the tech giant to task for not telling its users soon after it found out their data was breached by political consultancy firm Cambridge Analytica.
He said Facebook fell short of its own pledges on transparency, which calls into question whether it can be relied on to fight false news on its own.
Facebook Asia-Pacific vice-president of public policy Simon Milner admitted the company was wrong to withhold the information, but said steps have been taken to fix the problem, and that it is serious about cooperating with governments to fight disinformation.
The company, with Twitter and Google, had suggested new legislation was not necessary in Singapore. But the committee said other experts at earlier hearings had pointed out gaps in existing laws.
Facebook chief executive Mark Zuckerberg has acknowledged the company should have done better in handling the breach. Outraged US and British lawmakers have opened investigations into the scandal that came to light in the past week.
Cambridge Analytica scandal: How Facebook data helped Trump win over voters
The Straits Times, 22 Mar 2018
WASHINGTON • It was one of hundreds of cute questionnaires that were shared widely on Facebook and other social media platforms, like "Which Pokemon are you?" and "What are your most used words?".
This one, an app called "thisismydigitallife", was a personality quiz, asking questions about how outgoing a person is, how vengeful one can be, whether one finishes projects, worries a lot or is talkative.
About 320,000 people took the quiz, designed by a man named Aleksandr Kogan, who was contracted to do it by Cambridge Analytica, a firm founded by United States Republican supporters including Mr Steve Bannon, who would become the strategist for Mr Donald Trump.
As Dr Kogan's app was circulated via Facebook, it reaped far more than just the information on those who took the test.
At the time, in 2015, such apps could scrape up all the personal details of not only the quiz-taker, but also all their Facebook friends.
That ultimately became a hoard of data on some 50 million Facebook users - their personal information, their likes, their locations, their pictures and their networks.
Marketers use such information to pitch cars, clothes and vacations with targeted ads. It was used in earlier elections by candidates to identify potential supporters.
But for Dr Kogan and Cambridge Analytica, it was a much bigger goldmine. They used it for psychological profiling of US voters, creating a powerful database that reportedly helped carry Mr Trump to victory in the 2016 presidential election.
The British-based political consultancy firm says its work with data and research allowed Mr Trump to win by a narrow margin of "40,000 votes" in three states, securing victory in the electoral college despite losing the popular vote by over three million votes, according to Slate, an online magazine.
The data let the Trump campaign know more than perhaps anyone has ever known about Facebook users, creating targeted ads and messaging that could play on their individual biases, fears and loves - effectively creating a bond between them and the candidate.
The data collected via Dr Kogan's app generated an incredible 4,000 or more data points on each US voter, according to Mr Alexander Nix, Cambridge Analytica's chief executive before he was suspended on Tuesday. The output was put to work in what Mr Nix called "behavioural micro-targeting" and "psychographic messaging".
Simply put, the campaign could put out messages, news and images via Facebook and other social media platforms that were finely targeted to press the right buttons on an individual that would push him into Mr Trump's voter base.
For Mr Trump, it worked.
AGENCE FRANCE-PRESSE
Doubts whether social media firms can help to fight fake news: K. Shanmugam
Data breach scandal shows Facebook has fallen short on transparency: Minister
By Tham Yuen-C, Senior Political Correspondent, The Straits Times, 23 Mar 2018
The conduct of Facebook in the data breach involving Cambridge Analytica gives the Government reason to question whether the social network can be trusted to cooperate in the fight against online falsehoods, said Home Affairs and Law Minister K. Shanmugam on Thursday (March 22).
On the fourth day of the Select Committee's hearings, representatives from Facebook as well as Twitter and Google were asked about their statements and actions, as the committee looked to their track record to determine if they will be reliable partners in countering fake news. In their submissions to the committee, the firms said there are enough laws in place to tackle the problem in Singapore, without new legislation.
Facebook vice-president of public policy for Asia-Pacific Simon Milner found himself in the hot seat, with the hearings here happening just days after revelations that political consultancy firm Cambridge Analytica had exploited the private information of 50 million Facebook users.
Dwelling for some time on this matter, Mr Shanmugam said it was his view that Facebook had fallen short of its professed standards of transparency in handling user data.
The data breach took place in 2014 and Facebook knew about it in 2015, but it was not until a few days ago that it admitted to it.
Mr Milner, who has given evidence to Parliaments in three other countries, had told British MPs last month that Cambridge Analytica did not have Facebook data.
Pointing to this, Mr Shanmugam said Mr Milner should have come clean about the matter. The fact that he did not made it reasonable to conclude that Facebook had deliberately sought to mislead the British Parliament, as committee chair Damian Collins had suggested, the minister said.
Rejecting this characterisation vigorously, Mr Milner said he had answered truthfully based on the information he had at that point.
But he conceded that others could leave with the same impression as Mr Collins, and admitted he should have given a fuller answer in hindsight.
At one point on Thursday, Mr Milner questioned the relevance of various news articles and statements by Facebook executives he was being asked to comment on and looked to committee chairman Charles Chong to step in.
He added that such articles create the overall impression of a firm that does not care about the problem and is not doing anything to address it, when in fact Facebook has made huge investments in the area and will double the number of people working on security to 20,000 by the year's end.
"Please don't just read articles like that," he said, urging the committee to consider a more complete range of information, including the social network's paper on information operations, and comments by its chief Mark Zuckerberg.
To this, Mr Shanmugam said Facebook's conduct elsewhere and expert opinions of the firm are highly relevant in figuring out if the firm would be voluntarily helpful or if the Government would have to intervene, such as through legislation.
It was only fair, he said, for Facebook to be given the opportunity to respond, since this information could inform the Government's decision later.
He said it was his view that the company had not behaved responsibly thus far, adding that "we will have to wait and see what you do".
Mr Milner disagreed but said it was fair to hold Facebook to account: "This has been a tough Q&A, I respect that you are asking questions that need to be answered and we as a company need to be accountable to you and your colleagues and other policymakers, most importantly, the community of 2.2 billion, including some 4.1 million people here in Singapore, about how we protect their data, how we keep it secure and when things go wrong, how we tell them and you about it."
Another reason to dwell on what Facebook has done elsewhere is that online falsehoods could affect national security, and the meddling in elections in the US could well happen here, said Mr Shanmugam.
"We know our position as Singapore in the world. We are not the United States of America. If a very senior legislator in the US feels that you are not being cooperative, then how do we expect that you will cooperate with us? But these are issues that we are entitled to explore," he said.
Singapore wants technology companies to succeed and considers them partners in the fight against online falsehoods, but it did not mean the Government has to accept that their claims of what they can or have done is enough to fix the problem, he added.
SHANMUGAM ON THE RELEVANCE OF HIS LINE OF QUESTIONING
The Straits Times, 23 Mar 2018
Here are edited extracts from the lengthy exchange between Law and Home Affairs Minister K. Shanmugam and Facebook vice-president of public policy for Asia-Pacific Simon Milner.
Mr Milner: "This committee is looking into the issue of deliberate online falsehoods here in Singapore. Myself and my colleague and other people on this panel have come here prepared to answer questions about and to help the committee understand it.
"I don't think it is fair to ask me detailed questions about evidence given by my colleague to a different Parliament in a different country about activities associated with that country... I am really trying to understand why we aren't talking about the issues in Singapore, about the deliberate online falsehoods here, about what our companies are doing about this... I really respectfully suggest... if you want to get to something, get to it, and let's have other people answer some questions."
Mr Shanmugam: "The questions before the UK Parliament were very relevant in exploring the degree to which you can be trusted, Facebook can be trusted to answer questions when asked, Facebook can be trusted to be a reliable partner, that a government of Singapore can depend on Facebook to tell us the truth, the whole truth and nothing but the truth in proceedings where the witnesses are sworn, or whether you will do everything you can to give lawyers' answers or lawyered answers.
"As I told you earlier, one looks at the sequence of conduct from 2015 to 2018, and the very first time you accept the responsibility for Cambridge Analytica publicly, when did that happen and why did that not happen earlier?
"And to what extent can we take seriously all these protestations that you can be completely trusted to apply your internal guidelines? It is very relevant.
"And if you thought that you could turn up here today, not answer questions on Cambridge Analytica and explain your answers today with your answers less than five weeks ago to a different Parliament - we are all sovereign Parliaments, but we look at your conduct all around the world and we have to understand.
"Second, why are we looking at these answers? We are looking at our national security, the consequences we have.
"By looking at your answers elsewhere, it is clear and you have confirmed you will not decide whether something is true or false, you will not take down something simply because it is false.
"You will take it down if there is a legal obligation on you and your argument, up to very recently, through the written representations, through the public statements, through all public positions that you have taken, in essence is that you will prefer to be regulated yourself with your internal guidelines - that is my sense of it, if I am wrong, I am wrong - and that you do not want to be regulated."
READ OUR EVIDENCE FIRST, SAYS FACEBOOK
Mr Milner responding to Mr Shanmugam, who had asked for his comments on statements made in various articles on Facebook:
"I agree, if you were just to read all those articles, I would be really worried about Facebook if I just read those articles.
"But please don't just read articles like that, also read our evidence, read our information operations paper, read the whole of (Facebook general counsel Colin Stretch's) evidence to that committee, read the commitments by our CEO Mark Zuckerberg... I can't imagine any reasonable person who could read all of that and conclude that we can't work with or trust this company."
Facebook probing if data breach affected Singapore users
By Seow Bei Yi, The Straits Times, 23 Mar 2018
Facebook is investigating if any of its Singapore users' personal information was inappropriately obtained and shared with British political consultancy Cambridge Analytica.
Affected users will be informed, the tech giant's Asia-Pacific vice-president of public policy, Mr Simon Milner, said yesterday at a Select Committee hearing on deliberate online falsehoods.
He also said Facebook is looking into whether there are other data breaches involving app developers.
His remarks come in the wake of recent media reports about Cambridge Analytica, which is at the centre of a scandal in which it is accused of exploiting the data of more than 50 million Facebook users for commercial and political use.
Cambridge University researcher Aleksandr Kogan had used an app to extract the users' information. By allegedly accessing user profiles, the firm could infer the political preferences of United States voters and target personalised messages at them to benefit Republican candidate Donald Trump.
Home Affairs and Law Minister K. Shanmugam repeatedly questioned Mr Milner about Cambridge Analytica yesterday, in a lengthy and, at times, heated exchange.
Mr Milner conceded at one point that Facebook "got it wrong" and should have informed users about the data breach, noting that it had a "moral obligation" to do so.
He echoed Facebook chief executive Mark Zuckerberg's post yesterday which admitted there was "a breach of trust between Facebook and the people who share their data with us and expect us to protect it".
Asked by Mr Shanmugam yesterday if Facebook could be expected to have done more in ensuring the data Cambridge Analytica took inappropriately had been deleted, Mr Milner said: "Yes, given the actions we are now taking."
Mr Shanmugam also questioned if Mr Milner had been "careful and economical" with the truth when he told British MPs last month at a Select Committee inquiry into fake news that Cambridge Analytica did not have Facebook data.
To this, Mr Milner stressed repeatedly that his answers were accurate based on what he knew at the time.
The consultancy had given Facebook a sworn affidavit saying it had no Facebook data, Mr Milner said. He later conceded that in hindsight, he should have "provided a fuller answer to the committee and made them more aware of what we understood to be true".
Mr Shanmugam also asked why Facebook did not verify the certification from Dr Kogan that he and Cambridge Analytica had deleted the data it obtained.
Mr Milner replied: "That is one of the lessons for us, in terms of why we are now going to audit all other apps and not just take their affirmation... that they have deleted data or not passed it on."
Political ads: Facebook may curb foreign currency use
By Ng Jun Sen, Political Correspondent, The Straits Times, 23 Mar 2018
Tech giant Facebook is prepared to consider banning foreign currencies from being used to pay for Singapore political advertising on its platform, its vice-president of public policy for Asia-Pacific, Mr Simon Milner, said yesterday.
"We do not as yet have that policy," he added, when responding to Law and Home Affairs Minister K. Shanmugam at the Select Committee hearing on deliberate online falsehoods. He agreed with Mr Shanmugam that Facebook should take actions to ensure only people based in the country can buy ads meant for the country, and said he expects the currency measure to be part of "the range of things that we do to preserve the integrity" of elections.
The policy, however, would apply only to countries, such as Singapore, that forbid foreign influence in domestic politics, he said.
Facebook's inability to trace sources of funding for political ads has come under fire globally, following revelations that tech giants - including Twitter and Google - could have been exploited by Russia to sway voters. Facebook, in particular, had accepted political ads in the 2016 US presidential election that were paid for in Russian roubles.
Asked about it at a US Senate judiciary hearing last year, Facebook's vice-president and general counsel Colin Stretch did not commit to banning political ads bought in foreign currencies, saying that currencies did not necessarily indicate the source country of an ad.
Mr Milner said: "It is really difficult to define what is a political ad. Most of the ads, in the case of the US, may not have been classified as political ads under the jurisdiction there because they did not endorse a candidate."
Responding, Mr Shanmugam said: "So, if we define what is a political ad for you, then you will do it?"
Yes, said Mr Milner. He later said the type of political ad would have to be narrowly defined, as a move against foreign currency payments may affect non-state organisations.
"Remember, we have international organisations, human rights organisations, that will often seek to advertise on our platform to highlight human rights abuses," he said, adding that he was referring to other countries.
Highlighting Facebook's recent efforts, he said his company is piloting a "transparency exercise" in Canada, and plans to use it in the US mid-term elections later this year.
It requires Facebook page owners and administrators who want to put up ads to verify their details in a note sent by snail mail to Facebook.
"It will be available globally. By shining a spotlight, we know that is often the best antidote to malfeasance and bad behaviour, and that is what we will be doing," he added.
Representatives from Google and Twitter set out steps they have taken to improve their policies and add new tools to combat deliberate online untruths. Google's Asia-Pacific news lab lead Irene Jay Liu said its "fact-check tag" shows that certain news stories have been checked by news publishers and fact-checking entities.
Google is studying how to discern between low- and high-quality content, using algorithms to tell facts from falsehoods, she told Minister for Social and Family Development Desmond Lee.
Twitter's director of public policy and philanthropy in Asia-Pacific Kathleen Reen said it has made "an enormous amount of progress" since the US presidential election.
She highlighted Twitter chief executive Jack Dorsey's call for proposals this month to look into bad behaviour on the platform, with the firm sharing its data and funding researchers to join the effort.
Tech giants argue against more laws to tackle fake news
However, they concede there could be gaps in Singapore's laws for quick action to be taken
By Yuen Sin and Seow Bei Yi, The Straits Times, 23 Mar 2018
Tech giants Facebook, Google and Twitter yesterday argued against the need for additional legislation to tackle the threat of online untruths, saying they are already taking steps to address the issue.
The companies told the parliamentary Select Committee on deliberate online falsehoods that they have been investing heavily in technology and schemes.
This includes developing algorithms that can flag less trustworthy content and prioritise authoritative sources, as well as partnerships with non-profit organisations that help them identify and take down offensive material.
"Prescriptive legislation will not adequately address the issue effectively due to the highly subjective, nuanced and difficult task of discerning whether information is 'true' or 'false'," Mr Jeff Paine, managing director of the Asia Internet Coalition (AIC), wrote in his submission to the committee, adding later that multiple stakeholders have to be engaged instead of rushing to legislate. The AIC, an industry association of technology companies, counts LinkedIn and Apple among its members.
Ms Kathleen Reen, Twitter's director of public policy for Asia-Pacific, said in her written submission that "no single company, governmental or non-governmental actor should be the arbiter of truth".
However, Mr Paine conceded during yesterday's hearing that there could be gaps in Singapore's existing laws when it comes to taking quick action against online falsehoods, when quizzed further by Select Committee members Law and Home Affairs Minister K. Shanmugam and Social and Family Development Minister Desmond Lee.
Speaking to a panel of representatives from Facebook, Twitter, Google and AIC, Mr Lee questioned the ability of technology companies to self-regulate.
He cited how YouTube has not completely removed a 2016 video by banned British white supremacist group National Action after more than eight months, even though British Home Affairs Select Committee chairman Yvette Cooper flagged it multiple times over the past year.
"Their experience is something that we look at with concern, being a much smaller jurisdiction... even in clear-cut cases, there has been inaction," Mr Lee said.
Mr Shanmugam noted that there can be a difference between what countries and social media platforms may tolerate.
He referred to a post on Twitter with the hashtag #DeportAllMuslims, which was accompanied by a graphic cartoon of a topless mother, surrounded by toddlers of varying ethnicities. The picture was titled "The New Europeans". The tweet had not been taken down even after being flagged, despite its offensive nature, he said.
"This was not a breach of Twitter's hateful conduct policy. If this is not a breach... I find it difficult to understand what else can be."
He told the tech industry representatives: "The various beautiful statements you made... (have) to be tested against reality... For us in Singapore, this is way beyond what we would tolerate."
Facebook's Asia-Pacific vice-president of public policy Simon Milner pointed to difficulties in coming up with policies to tackle deliberate online falsehoods.
He highlighted that due process will be needed for a policy against online untruths, which is unlike "making a judgment on hate speech, or terrorism, or child sexual abuse - all the other areas of policy that we deal with".
"It is not that we are trying to abdicate our responsibilities, it is the particular notion of the kind of due process you require in order to be fair to people... that I think is more problematic for us than other policy areas," said Mr Milner.
He said that this is why using machine learning or proxies to nip the problem in the bud - a system that is still being tested - is what the platform considers to be the right approach.
Telcos highlight limitations to tackling online falsehoods
By Yuen Sin, The Straits Times, 23 Mar 2018
As Internet service providers, telcos Singtel and StarHub said they do not have the tools to monitor and selectively block the content of third-party materials sent through their networks.
For this reason, they are unable to adequately tackle deliberate online falsehoods on their own, both telcos told the parliamentary Select Committee on fake news yesterday.
Measures they have taken, like blocking a website, are also "blunt instruments" that cannot specifically target the root of the problem.
These limitations were highlighted in their oral and written representations to the committee.
But they agreed that a multi-pronged approach should be taken in addressing the issue, including public education and legislation.
Explaining its limitations, StarHub's head of regulatory affairs, Mr Tim Goodchild, said that while it can, say, block customers' access to a single domain like Twitter or Facebook, it is not able to restrict access to individual tweets or posts.
It is also possible for users to circumvent such measures, he added.
The situation is further complicated by the speed at which such posts can travel and become viral, he said when replying to Senior Minister of State for Communications and Information Janil Puthucheary, a committee member.
Mr Sean Slattery, Singtel's vice-president in charge of regulatory and interconnect matters, agreed.
He also said the role of public telecommunications licensees, like his company, is not to play "judge and jury" of content, especially in the light of the heightened privacy and security laws and push to encrypt content.
In fact, the Telecommunications Act prohibits such a licensee from familiarising itself with the content of a message.
But both telcos agreed fake news is a pressing issue. Singtel itself has been a victim of commercial scams out to obtain personal information or money. "There is very little recourse for us," said Mr Slattery.
Public education is just one element of the solution, both representatives told Dr Janil.
Unlike tech giants such as Facebook and Twitter, both telcos said extra legislation may be needed to effectively fight deliberate online falsehoods.
"They are platforms that reach billions of people. If they want to be socially responsible, they do have an obligation to contain the spread of deliberate online falsehoods," said Mr Yuen Kuan Moon, Singtel's chief executive of Singapore consumer business.
15-year-old is panel's youngest witness to date
By Ng Jun Sen, Political Correspondent, The Straits Times, 23 Mar 2018
When 15-year-old Zubin Jain wrote to a high-level parliamentary committee and outlined his views about the fake news scourge, he did not tell anyone.
His parents were "utterly shocked" when he told them last month that he was invited to appear before the Select Committee on deliberate online falsehoods - the panel's youngest witness to date.
Said his mother Asiya Bakht, 44: "We were scared. I asked if it was okay to testify. Will he be in trouble? Has he lied? Because in the e-mail (to the committee), he said he might also have spread online falsehoods."
Her worries were misplaced: Committee members praised his bravery in stepping forward to give evidence.
The Grade 10 student at United World College of South East Asia confessed that he did not know what to expect. "I had my friend read out my submission this morning, and there were so many spelling errors that I thought they called me here to correct my spelling. That was my secret fear," he told The Straits Times yesterday.
Zubin, who blogs about economics and politics, said he decided to write to the committee as he has had arguments with friends and family whose views and beliefs were often formed as a result of them having accessed false information. And such beliefs could be hard to correct.
He once argued with an aunt who believed in homeopathy - or alternative treatment - and questioned the sources of her information: "But she said homeopathy worked for her many times. You can't really argue with that even with scientific evidence."
To this, Madam Bakht shook her head, saying: "That is him - too much fact-checking."
He told the committee that any legislation against fake news should not target individuals who do not have a clear malicious intent. "As someone who writes and blogs online, I did not want my own habits and my freedom to speak get taken away," he said.
Robust exchanges as minister grills senior Facebook exec
By Elgin Toh, Deputy Political Editor, The Straits Times, 23 Mar 2018
Technology companies felt the full force of Mr K. Shanmugam's experience as a litigation lawyer yesterday, as the Select Committee on deliberate online falsehoods held its fourth day of hearings.
The Law and Home Affairs Minister dominated proceedings after the lunch break, with a lively, robust exchange with Facebook's vice-president of public policy for Asia Pacific Simon Milner.
Grilling Mr Milner on the Cambridge Analytica scandal that Facebook apologised for in the early hours of yesterday morning, Mr Shanmugam painted a picture of the social media giant being less than candid in its disclosures to its own users, and to a British parliamentary select committee, as it covered up a major breach of its data affecting 50 million users.
Perhaps fortuitously for the committee, the scandal unfolded right in the middle of its hearings, presenting itself as Exhibit A in the deliberations. Facebook CEO Mark Zuckerberg apologised in the United States hours before Mr Milner appeared in Parliament.
A data analytics and public relations firm, Cambridge Analytica illicitly harvested the data of 50 million Facebook user profiles, and later used it to influence voters on behalf of the successful Donald Trump presidential campaign.
When Facebook found out, it got the company to agree to delete the data. But it failed to inform users of the breach - an oversight it has now said sorry for. But it gets worse: Cambridge Analytica did not delete the data, and Facebook did not verify they had done so, but simply took the company's word.
Digging up the transcript of Mr Milner's testimony to British MPs last month, Mr Shanmugam cited statements by him that Cambridge Analytica did not have Facebook data, and put it to Mr Milner that he misled British MPs. There is "no excuse for not telling the whole truth and being full, frank and honest", said Mr Shanmugam.
Mr Milner said his answers were correct and based on what he knew at the time. But he conceded that he could have given a more complete accounting of the saga: "Do I wish I had given a fuller answer? Yes."
This was "one of the worst weeks" in his six years at Facebook, he said, adding that there was "a determination from the very top" to learn what went wrong and to make sure it does not happen again.
Protracted exchanges between the two men were at times heated, such as when Mr Shanmugam reprimanded Mr Milner for being "keen to answer questions I don't ask". At other times, they drew laughter. Mr Milner praised his interrogator for his sharpness of mind as a lawyer. Mr Shanmugam praised Mr Milner, too, saying: "You do better than most lawyers."
The two went on for three hours, as other witnesses from Twitter, Google and the Asia Internet Coalition sat in what looked to be an uncomfortable silence.
But Mr Milner pushed back vigorously too. At one point, he appealed to committee chairman Charles Chong, asking whether it was the best use of the committee's time for Mr Shanmugam to go over in fine detail statements that Facebook executives had made to panels in other countries and to ask for Mr Milner's views. At parts of the exchange, the two also apologised to each other for the rise in temperature. Striking a conciliatory tone near the end, Mr Shanmugam said: "We look at you as partners." Mr Milner replied: "I appreciate hearing that."
What was Mr Shanmugam's goal? It appeared it was to show that while companies like Facebook controlled platforms on which the bulk of citizen discourse takes place, they did not show a level of responsibility equal to the power and influence they had.
They hold sway in modern democracies, and the content they carry can change election outcomes, divide communities, or become a channel for foreign state-sponsored disinformation campaigns. Yet, they are reluctant to take a firm stand on fake news.
Facebook did not want to be an arbiter of what was true, said Mr Milner. In effect, that meant even a news item as ridiculous as one from the 2016 US election campaign, which claimed presidential hopeful Hillary Clinton ran a paedophile ring, went unchallenged.
Mr Milner explained that there have been instances when rumours turn out to be true. He added that Facebook will take down posts if it received a court order.
Other committee members also cited egregious posts that tech companies were slow to act on, including a neo-Nazi video that YouTube did not block for months, even after a panel of British MPs contacted YouTube.
Two points were clear from the exchanges. First is the impression that it may not be in the financial interest of social media firms to act. That could explain why Facebook did not inform users of the data breach, and why it was not keen to take down falsehoods - since falsehoods tend to drive traffic, and traffic draws advertising revenue.
Second, as the social media firms are US-based, they have a different instinct on what is acceptable speech or action. The US enshrines free speech - to the point that someone can even burn the Quran.
Against this backdrop, and if social media companies continue to be lethargic in their actions, Singapore may have little choice but to safeguard its own interests and introduce laws to compel these companies to abide by the social norms here and remove deliberate falsehoods.