4 years ago by crazygringo
I see a lot of comments misinterpreting this.
First, it's not about private files, it's about distributing content.
Google isn't spying on your private files, but does scan them when you share them publicly. E.g. keep all the pirated movies you want on your Drive, and even give private access to friends, but the moment you make them publicly viewable Google scans them and limits access accordingly. So no, this isn't applying to your private diary or privately shared documents.
And second, to those who claim absolute free speech with no limits -- notice that the two main categories here are related to democracy and health. All our legal protections ultimately depend on a democratic foundation -- undo that with misinformation and you don't have anything anymore. Similarly, your rights don't matter much if you're dead. Companies aren't allowed to advertise rat poison as medicine and neither are you.
4 years ago by contravariant
There's something fundamentally flawed about the idea that censorship in the name of preventing misinformation is protecting the foundation of democracy.
You cannot have true democracy if people cannot disagree with their governments; they must be able to disagree with any truth or opinion such a government might consider self-evident, just on the off chance they're right.
I should at this point note that Google doesn't directly claim to go quite that far in preventing misinformation; they mostly claim to disallow things that could harm the democratic process (e.g. telling people to vote at the wrong place, or that their candidate has died, etc.). At least that kind of information is usually agreed upon (if not, there are bigger problems than mere misinformation), though they seem to try to include claims of voter fraud, which is a bit dangerous.
4 years ago by hackererror404
Imagine if Britain had this same technology when the USA was founded... It of course would have quickly cracked down on communications and it would have done so in the name of "peace" and "what's right"...
Thinking critically of a government, and even believing that perhaps the government as it stands today is not the government "of and for the people", sure could be interpreted as anti-democracy by that same corrupt government. And maybe that belief is not correct, but who is the government to say that we can or cannot challenge them in public discourse, as is supposedly protected under the First Amendment?
This is indeed an insanely slippery slope, and people willing to trade their freedoms because they think it's for the ultimate good are, I think, really making a mistake. It's not difficult to understand that this is one of the first steps of an actual, fundamentally corrupt government... This is easily open to abuse and vast interpretation.
4 years ago by narrator
If the American revolution had happened 100 years later it would have looked like the Boer war. That's the war where the British used barbed wire and the machine gun to invent the concentration camp.
4 years ago by shadilay
This was explored before in an excellent blog post.
https://kieranhealy.org/blog/archives/2013/06/09/using-metad...
4 years ago by mohas
Welcome to Iran: the UK of the 17xxs with the online monitoring and censorship technology of the 20xxs.
4 years ago by BrainWorm
`Magna Carta originated as an unsuccessful attempt to achieve peace between royalist and rebel factions in 1215`
4 years ago by nickysielicki
The bigger problem that I have with the idea that misinformation kills democracy is that it seems to suggest that misinformation is some new phenomenon or that the average person has been well informed throughout the history of western democracy.
Democracy thrived before the printing press. Democracy survived the invention of the printing press, which was mostly in the hands of magnates who could afford it. Democracy survived the invention of television and radio, which was (and still is) in the hands of a select few magnates. We build up terms like "journalistic integrity" and look at the past with rose colored glasses as if these mediums delivered pure objective truth.
If anything, what we're seeing with the internet is a more true democracy with a wider range of opinions, less controlled by small groups of plutocrats. If you don't like to see the death of that plutocracy, or you're happy to see a new group of benevolent plutocrats come in to retake control of the narrative, I hate to be the one to tell you this, but you don't really like democracy.
4 years ago by acituan
> suggest that misinformation is some new phenomenon
Misinformation in this shape and form is a new phenomenon. And it is not just the scale;
- the number of agents that push their version of misinformation is at least an order of magnitude higher than ever, depending on the particular topic. So-called culture wars have so many different sides.
- technology not only scales misinformation, it also accelerates it. The objective function of "increased engagement" meshes very well with it. Hard-to-grok, full-fidelity facts don't get shared or recommended as much as rage-baiting or bias-confirming material.
- technology can on-the-fly piece together material to conform to whatever bullshit you want to hear, I want to hear or the other guy wants to hear. As it is optimized to increase engagement, it can efficiently generate personalized micro-narratives, which is ultimately a reflection of our personal biases.
The problem is that it gets harder and harder for these narratives to converge. More on that below.
> If anything, what we're seeing with the internet is a more true democracy with a wider range of opinions, less controlled by small groups of plutocrats
As mentioned, original thoughts don't have the same propagation speed or reach as junk-infotainment, and you're just as subject to the narrative-shaping powers of those "plutocrats" as ever. They just blend in better.
But the larger issue is that you can't equate mere plurality with a functioning democracy. Ultimately there is a single reality, and even though we are in divergent positions due to having different entry points and framings, we should be -- however little -- converging in our narratives and understanding of that reality as time progresses.
But the opposite seems to be happening: we are getting dumber at scale, stuff makes less sense, and institutional mistrust is at an all-time high. I am not putting this all on tech, but it certainly pours fuel on the fire of the meaning-making crisis.
I wouldn't take it for granted that we could survive this without it creating a larger crisis first.
4 years ago by shrimpx
What I'm seeing is a disintegration of the narrative, with a relatively small group of disinformation plutocrats bombarding minds at scale with conflicting positions.
A compassionate view of humanity would say that humans are basically accepting. This openness can then be abused by viral misinformation. We could take the view that humans should just be self-protecting and if they got duped that's on them. But IMO that's a depressing view of the world, and tends toward something like mutually assured social destruction in the limit. We need to protect our shared narrative.
Also, personally I find the view that "democracy prevailed before, it'll continue to prevail somehow" deeply unsatisfying. Democracy is not built into nature. It has to be proactively maintained and refreshed.
4 years ago by himinlomax
> Democracy thrived before the printing press
No it didn't.
Do you have examples of democracies before the printing press besides Athens and arguably the late Roman Republic? Those happened nearly two millennia before the press and didn't last that long. "Thriving"?
4 years ago by tshaddox
When and where did democracy thrive before the printing press?
4 years ago by crazygringo
> There's something fundamentally flawed
There isn't really. You're adopting, I assume, J.S. Mill's view that the cure for bad speech is more speech, which he famously published in 1859.
However, since then it's been widely accepted that when speech reaches a certain level of harm then the greater good is to prevent/punish it. You can't incite violence under the guise of free speech. You can't advertise that something is safe when it's not. This is because more speech can't undo violence and death after it occurs.
And when it comes to misinformation with regards to provable and intentional lies about voting procedures, election results, etc. that falsely harm the country's institutions and legitimacy, it's entirely consistent for that to fall under the widely-accepted prohibition of speech that rises to a certain threshold of harm. It directly leads to mobs, riots, and revolution based on lies, not based on actual injustices.
This doesn't mean all harmful speech is prohibited -- that would be ridiculous. You're generally allowed to insult people, tell lies, etc. But there's a threshold of harm that gets established.
4 years ago by simonbarker87
Annoying that you are getting downvoted for what seems to be a very reasonable comment.
I have very few friends "in tech" and this is the view that basically all of them hold; this is the view that most of my family hold. Across the 100-150 people that spans the full (European) political spectrum and many different backgrounds and life experiences, from growing up extremely wealthy, to finding wealth through hard work (and luck) and success, to borderline surviving -- not "SV tech circles".
Basically it's a viewpoint that is able to accept nuance and grey. People who work in absolutes dominate headlines so it's all we hear; in reality the majority of people live in the middle.
4 years ago by logicchains
>However, since then it's been widely accepted
It's not widely accepted at all, except maybe in the small circle of Silicon Valley elites whose ideas align with yours politically.
4 years ago by fighterpilot
> the greater good is to prevent/punish it.
It's clear that a handful of genocides were caused in large part by hate speech, such as the Rwandan genocide and the Holocaust.
What's not clear to me (although I'm open either way) is whether strict hate speech laws would've reduced the odds of these happening. Do we have reason to think that to be true?
The first order effect is to chill that kind of speech. But is there a second order effect of making these people into martyrs and fostering resentment towards the protected group that does more harm than good?
My understanding is that pre-Nazi Germany had hate speech laws, and it didn't seem to work there?
https://www.bjpa.org/content/upload/bjpa/4_an/4_Anti-Semitis...
4 years ago by daddylongstroke
>It directly leads to mobs, riots, and revolution based on lies, not based on actual injustices.
Would love to see an example. The vast majority of revolutions in human history were results of hunger.
4 years ago by weare138
>You cannot have true democracy if people cannot disagree with their governments; they must be able to disagree with any truth or opinion such a government might consider self-evident, just on the off chance they're right.
Google is not the government. It's a private company owned by private citizens who also have the same constitutional rights. You're not being 'censored'. It's not a violation of your free speech. You're free to petition the government and you're free to just host your files someplace other than Google Drive. Access to Google Drive is not a 'right'. Quit trying to conflate the two. It's a disingenuous argument meant to confuse the issue and push a personal narrative.
4 years ago by freebuju
> You're not being 'censored' .. It's not a violation of your free speech... You're free to petition
The greatest trick the Devil ever pulled was convincing the world he didn't exist.
> It's a private company owned by private citizens who also have the same constitutional rights.
Given the scale at which Google operates today, anything short of defining some of its popular products as public utilities would be disingenuous.
> Access to Google Drive is not a 'right'
Except this is not a Google drive (the product) access issue. It is an access to information issue, if you look at it fundamentally.
4 years ago by ptx
By this argument China's censorship of online messaging platforms doesn't exist either, because Weixin and Weibo are operated by private companies, so what they're doing when they block messages with undesirable content isn't censorship.
4 years ago by mjthompson
I'm not even close to convinced by your response. Relying on the public-private divide as the sole basis for your retort is weak. You also assert that the person is pushing a personal narrative, but I suggest you're doing the same.
There's an argument that private corporations that are involved in dissemination of information (search engines and social media) should respect principles of freedom of speech as a democratic principle, regardless of constitutional mandate.
Suppose the government outsources welfare eligibility decision-making to a private company. Does this mean traditional notions of fairness we would expect from such a decision-maker do not apply, because they are a private company?
4 years ago by wisty
If something is bad when a government does it, surely it's also bad if a really large company does it.
Come to think of it, why would it be bad if a government censors speech?
4 years ago by wes-k
> Companies aren't allowed to advertise rat poison as medicine and neither are you.
You may want a different example :).
> Warfarin first came into large-scale commercial use in 1948 as a rat poison. Warfarin was formally approved for human use by the US FDA to treat blood clots in 1954.
4 years ago by mastazi
> when you share them publicly
from the wording in the link[1], it seems that even if you share the document with just one single person, and that person flags it, Google is then allowed to investigate. So the pre-condition is not sharing publicly, just sharing.
> So no, this isn't applying to your private diary or privately shared documents.
Well, this seems to be inaccurate based on the text cited below[1]; do you have sources that back your claim? There is nothing saying that privately shared documents can't be reviewed. The only necessary condition seems to be that someone flagged your content, which could be the one person you shared that content with.
[1] "After we are notified of a potential policy violation, we may review the content and take action, including restricting access to the content, removing the content, and limiting or terminating a userâs access to Google products".
4 years ago by zekrioca
It is funny how people blindly trust Google on this..
4 years ago by wutbrodo
> All our legal protections ultimately depend on a democratic foundation -- undo that with misinformation and you don't have anything anymore. Similarly, your rights don't matter much if you're dead.
This is an insane conclusion to draw, because it blithely ignores that it positions Google as an oracle of what's threatening to democracy and what isn't.
The examples in the current policy are fairly narrow, but this is a categorical line being crossed (for better or worse). Those who are concerned by the increasing encroachment of effective utilities on what can be communicated need to speak up when clear lines are crossed, because it's the only way to avoid frog-boiling.
4 years ago by wwweston
It's funny to see people freak out about this when the entire foundation of Google is built on the proposition that it can make effective judgments about the relative value of various content out there on the web.
Like, if Google has a hidden agenda here that makes it any more fundamentally compromised as a judge of disinformation than anything else, then whether or not it will be a free CDN for arbitrary content isn't remotely our biggest problem.
And if they're just that bad at sorting information from disinformation in spite of their considerable resources and ostensible value proposition, same thing goes, although that's an opportunity for someone else to the extent that there's a market for understanding reality.
So, yeah. Google by its nature is going to play a role regarding what's a threat to accurate understanding and democracy. It's not alone in this; journalism does it. The academy does it. The courts do it. Businesses do it.
The way you get a well-functioning society where robust discourse turns into better perception and refined ideas isn't that everybody takes a hands off approach, it's that everybody -- all institutions and individuals -- take responsibility.
Not to mention that requiring Google (or anyone) to carry and disseminate information that they consider irresponsible... well, compelled speech isn't exactly freedom of speech.
4 years ago by concordDance
> Like, if Google has a hidden agenda here that makes it any more fundamentally compromised as a judge of disinformation than anything else, then whether or not it will be a free CDN for arbitrary content isn't remotely our biggest problem.
Indeed, the bias in Google Search is actually a very serious problem.
4 years ago by wutbrodo
> It's funny to see people freak out about this when entire foundation of Google is built on the proposition that it can make effective judgments about the relative value of various content out there on the web.
You're thinking of Google Search, their web search product. The thread is talking about Google Drive, their file-storage and -sharing product. If you don't get the difference between these two products and the expectations around them, I don't know what to tell you.
4 years ago by intended
This positions google as an intermediary.
Someone else has figured out what is misinformation and is telling google.
4 years ago by wutbrodo
The idea that _anyone_ can reliably enough decree what's misinformation is ludicrous. I acknowledge that there are vast masses of incredibly stupid people that need to pretend that Truth is handed down on clay tablets by God in order to function. We're much better off with the scientific establishment wearing this mantle than, say, religious institutions. And it's pointless to try to convince these people that science is an iterative, incremental process that's based in skepticism, not certainty.
But the minority of society that understands and participates in the process of truth-formation (including scientists!) produces a widely disproportionate amount of epistemic value, and society depends on this process for basic functioning.
It's amazing to me that this isn't clear to everyone after the pandemic, of all things. The amount of claims that were banned from social media as "misinformation" that became expert consensus a couple of months later is mind-boggling. Following smart and quantitative people on Twitter was wayyyy more likely to provide you a healthy and safe pandemic experience than following the incoherent and self-contradictory public health recommendations (let alone policy). More important than this "direct-to-consumer" ability to discuss the pandemic is that experts themselves form their opinions through this type of discussion. The notion that there's a "someone else" who has reliably figured out which dissent is out of bounds is laughable.
I'll note again that Google's current policy is limited to fairly simple things, but it's an important Schelling fence being torn down and worthy of commenting on (and pushing back against, if you believe the trend is harmful).
4 years ago by irthomasthomas
I just cancelled my drive subscription.
I don't know how anyone could continue using it after this.
I probably have dozens of docs and hundreds of research papers contradicting government health advice on diabetes and heart disease. These would fall under "Misleading content related to harmful health practices" since they promote a health theory which the government considers harmful.
However, I would have cancelled regardless, since the idea of automatic bans and/or content deletion based on ML models is crazy. They are obviously going to find a lot of false positives, and I can't deal with the idea of trying to speak to Google to explain that their algorithm mistakenly flagged my innocent content. In other words, even if you are the perfect citizen, there is a chance you will get flagged anyway.
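To put rough numbers on the false-positive problem (every figure below is an illustrative assumption, not anything Google has published), a back-of-the-envelope sketch:

    # Back-of-the-envelope base-rate arithmetic. Every number here is an
    # illustrative assumption; none of them come from Google.
    total_shared_files = 1_000_000_000  # shared files reviewed per cycle (assumed)
    violation_rate = 1e-5               # assume 1 in 100,000 files actually violates policy
    sensitivity = 0.95                  # assumed true-positive rate of the classifier
    false_positive_rate = 0.01          # assume 1% of innocent files get flagged

    violations = total_shared_files * violation_rate
    true_positives = violations * sensitivity
    false_positives = (total_shared_files - violations) * false_positive_rate

    # ~10,000,000 innocent files flagged vs ~9,500 real violations caught
    print(f"{false_positives:,.0f} false flags, {true_positives:,.0f} true flags")
    print(f"innocent share of all flags: {false_positives / (false_positives + true_positives):.1%}")

With numbers anywhere in that neighborhood, over 99% of flags land on innocent content, which is exactly the "perfect citizen gets flagged anyway" scenario.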
4 years ago by tyingq
Yes, this is really bizarre. I get this kind of policy for some kinds of platforms, but not Drive, Docs, Sheets, Slides, and Forms. Those are my documents, and I should be free to put whatever I want in them.
What if I just like to collect and share old conspiracy theory stuff that I know is wrong? For whimsy, historical, whatever purposes...
4 years ago by soheil
This policy does not apply to private files. I just want to point out based on your comment that none of what you mentioned matters unless you share the document publicly.
4 years ago by mastazi
> unless you share the document publicly
Based on Google's wording[1], it seems you just need to share that document with one single person; if that person flags it, at that point Google is allowed to investigate even if the document is not public.
[1] "After we are notified of a potential policy violation, we may review the content and take action, including restricting access to the content, removing the content, and limiting or terminating a userâs access to Google products."
4 years ago by tyingq
"What if I just like to collect and share old conspiracy theory stuff"
The key word there is "share". Lots of people use the "share with anyone that has the link" for limited sharing.
4 years ago by intricatedetail
> This policy does not apply to private files.
Yet. Once people get used to it, it will be extended to private files. Likely they will even build it into Android and create an API to report citizens storing questionable documents.
4 years ago by maxk42
Some commenter replied it will change nothing. I disagree (but didn't downvote) - it will change the number of people in the market for a competitor. There are competitors out there and the people who are cancelling their Drive subscriptions here are going to support them financially, building a viable rival to Drive with their dollars. More competition is one of the best possible outcomes and I fully support it. Please cancel your Google subscriptions, folks!
4 years ago by intricatedetail
If this comes from government, then every competitor will eventually have this. Google was probably asked to test the idea and the impact.
4 years ago by jcadam
Yep. I just signed up for a paid email service too.
Time to migrate everything off of google services.
4 years ago by ibbibby
"Misinformation" is just another word for "falsehood" or "untruth."
Those of you claiming that "democracy" depends on authorities preventing the spread of misinformation are ipso facto saying that democracy requires the government, or megacorporate cartels with a monopoly on public speech most likely acting as proxies for the government (as Psaki made clear is happening), to define what counts as "truth" (a Ministry Of Truth if you will) and to stamp out what they've defined as "false."
It's insane, and it's amazing to me how many of you are so blinded by partisanship that you can't see that the recent media hysteria over "misinformation" is a blatant example of the contrived "emergencies" that all totalitarian regimes in history have used to seize control over free societies.
4 years ago by acituan
> "Misinformation" is just another word for "falsehood" or "untruth."
That's not sufficiently true. In fact, asserting untrue propositions is one of the easiest-to-counter forms of misinformation.
Real pros use humbuggery; of a set of n true propositions, pick a subset m to lead the audience to your conclusions and you haven't even "lied".
That's why "fact checking" is such a popular way of narrative laundering: the truthiness of individual propositions alone never reveals if someone was bullshitting you.
That's also why the courtroom maxim is "truth, nothing but the truth, and the whole truth". Only those 3 properties in combination would exclude misinformation. (Not saying courtrooms necessarily live up to this maxim.)
I agree with the spirit of the rest of your argument.
4 years ago by leereeves
> Real pros use humbuggery; of a set of n true propositions, pick a subset m to lead the audience to your conclusions and you haven't even "lied".
I've never heard the word humbuggery before, but I completely agree with the rest. Before social media we used to call that "choosing what to cover". It's also called a "lie of omission", so any censor who suppresses true information can reasonably be accused of lying (or misinformation) themselves.
As others have said, it's not new, but now, for the first time in US history, the media moguls are censoring not only their own broadcasts, but everyone's communications. Could America have ever developed as it has if the postal service or phone company had done that?
4 years ago by exporectomy
Though that's what happens, I don't think people call that misinformation but rather bias. Isn't misinformation factually wrong in the common meaning?
4 years ago by username90
People rarely go and read the actual article, so the headline must be accurate on its own or you misinform the public. Reversing or strongly altering the headline's statement in the actual article doesn't mean it is no longer misinformation; the damage is already done, as the masses read the headline and now think it actually happened that way. Yet this seems to be completely acceptable even in most reputable news sources.
4 years ago by admax88q
I think it's less about being factually wrong, and more about leading people to factually wrong conclusions with truthful statements.
Even just saying "X sells stock Y before event Z" implies that X knew about event Z and that it would affect the stock price of Y. People will read headlines like this and walk away assuming there was insider trading, but that may not be the case. Nothing in that example headline has to be false in order for it to spread falsehoods.
4 years ago by the_other
> That's not sufficiently true. In fact, asserting untrue propositions is one of the easiest-to-counter forms of misinformation.
Tell that to the children ICE detained separately from their parents.
4 years ago by the_other
(I confess I replied in a knee-jerk reaction to your set-up, rather than your main point. Sorry for that. Your main point has meat on its bones and seems worthy of further discussion.)
4 years ago by kofejnik
What about American citizens who are separated from their children when arrested?
4 years ago by rytill
Wouldn't you just need "the whole truth and nothing but the truth"?
4 years ago by thrwoi4234234
It's actually worse -- the "truth" necessarily implies that there is but one, and that everything else is false.
Does this sound familiar? This is exactly what religious loonies say in order to take control.
Science necessarily involves keeping your own ignorance, epistemic and otherwise, in mind while dealing with things, but it's quite worrying that the West is going back on what was won with blood and sweat.
4 years ago by adflux
The lab leak theory was discarded as false and conspiracy thinking; now many experts believe it to be the case... What is a conspiracy one day can eventually be the truth, e.g. the Tonkin Incident.
4 years ago by atoav
I am not sure about the US, but here in Germany most experts didn't say it was a conspiracy theory, but that those who claim this is the truth lack the data to back it up.
I could also argue that there is an invisible unicorn orbiting the solar system. As long as there is no real proof for it, we have to accept that it is just a theory. And the more facts align with my theory, the more motivation there should be to check my theory by trying to disprove it.
SARS-1/MERS was proven to stem from bats in the same region, so assuming the same origin was more in line with known/knowable facts than a lab leak theory. When the facts change, theories change; that is science.
4 years ago by pcrh
On the contrary, no expert believes the coronavirus lab leak hypothesis is correct; a few do however say it should be investigated.
This is exactly the kind of misunderstanding that this whole thread is about.
4 years ago by FeepingCreature
I believe there is only one truth, and everything else is false.
But I also believe humans do not have access to it.
4 years ago by pixl97
I call this thermodynamic truth.
And while it is the sole arbiter of truth, the moment something occurs, that truth starts decaying via entropy. Photons fly away at light speed, never to be seen by us again. The energy that remains starts mixing in ways that cannot be reversed. You quickly get to scenarios where more than one initial state could lead to the current state we can measure.
And worse, we can never exist in a system where we capture and keep this information. You either alter the 'experiment' by measuring it, aka chaos theory. Or you bring about the premature heat death of the universe.
4 years ago by mlac
There are facts and there is the context of the facts and the impact those facts have on people.
One can argue the news should just report the facts, but they add additional context and information to explain why the facts matter.
Verifying the facts / truth is objective and clear (e.g. it rained 2 inches today). Determining whether the impact is properly reported (e.g. "devastating" flooding occurred) is murky. And the flooding could have been devastating -- to one family, to a village, to a school. So it's not untrue, it's just more subjective as you move from numbers to impact. And the news cares more about reporting impact than facts and will tailor the narrative to explain the impact to their audience.
Look at the news service AllSides. You can figure out the facts (e.g. a law was passed), then see what each side is saying about the impact. The impact may be true for both sides, just presented in a vastly different way.
4 years ago by dudeman13
I agree with this position.
The idea that there may be 'truths' sounds utterly bonkers to me. a ∧ ¬a is considered a contradiction for a reason.
4 years ago by nix23
>I believe there is only one truth, and everything else is false.
That is maybe true with science, but not with living breathing things.
Just one example:
What is the best system to live in? Capitalism, Socialism or a mix of Capitalism AND Socialism?
Often it is just not a question of truth when it comes to humans.
4 years ago by eecc
Not sure about the intent of Google (and, as they say, the road to hell is paved with good ones), but you're reframing. It's not a matter of censoring deviations from orthodoxy, rather one of removing disinformation and demonstrable falsehoods, often used for propaganda and to set up victim scenarios.
4 years ago by argvargc
Those engaging in widespread censorship create disinformation and falsehood, by omission.
In history, it has always been those engaging in widespread censorship who turn out to be disastrously and/or maliciously wrong.
The science now being censored is so well-established that, at this point, Google/YT et al. have deleted and will continue to delete and suppress the sharing of peer-reviewed science published in mainstream journals and indexed in PubMed.
That 100% ends their credibility. I deplore anyone expecting an explanation as to why.
Anyone yet standing by such incredulous, irresponsible and/or actively-malicious action, reveals themselves as same, for all to see.
4 years ago by raverbashing
> the "truth" necessarily implies that there is but one, and that everything else in false.
Sounds like the QAnon people
4 years ago by vixen99
> Companies aren't allowed to advertise rat poison as medicine and neither are you. https://en.wikipedia.org/wiki/Warfarin
In fact, they are and they do. (OK, not you, unless you run a pharma company). The best-known rat poison is warfarin, an anticoagulant used worldwide as a medicine under various names.
As to your main point, how do you, or others on SM, define misinformation? Do you believe the shadowy folk (qualifications unstated) who pontificate at FB, Twitter, Google and Wikipedia? That has to be absurd and shocking. Anyone who wants to (and wanting to seems to be the issue) can see the fatuity of this after searching for overturned consensus views as supported by the SM platforms mentioned. The search should include the peer-reviewed literature.
A most recent volte-face relates to Covid origin. In April it was undoubtedly of natural origin, as we were authoritatively informed by the Lancet. Now in July, what was previously regarded as a conspiracy theory is taken seriously by people able to make a judgment. Who does one believe: the unknowns at SM, or someone like Peter Palese (https://labs.icahn.mssm.edu/paleselab/), who was among 27 scientists who had earlier signed the Lancet letter denouncing as "conspiracy theories" the notion that the coronavirus could have escaped from a lab -- or even be man-made? He now disavows that claim, as do many others of similar status.
4 years ago by 1vuio0pswjnm7
Except the term "misinformation" appears nowhere on the page https://support.google.com/docs/answer/148505
Instead it refers to "misleading information".
Absent a specific definition of "misinformation" from Google's lawyers -- who likely authored or at least reviewed this page -- we are left to consult the dictionary.
The dictionary defines "misinformation" as "information that is incorrect". There is no requirement of intent. The information may or may not be misleading.
That is, the term "misinformation" may apply to any incorrect information regardless of intent.
Is it possible to have incorrect information ("misinformation") that neither is intended to mislead nor does mislead (~ "misleading information")? Your answer: ___
Is it possible to have correct information (~ "misinformation") that is intended to mislead or does mislead ("misleading information")? Your answer: ___
4 years ago by samiran149
> Is it possible to have correct information (~ "misinformation") that is intended to mislead or does mislead ("misleading information")? Your answer: ___
No need to answer this question. The technodemocratic complex has already answered it. It's tawdry, but the Hunter Biden laptop was "misinformation", because it was intended to mislead people away from Biden.
Doesn't matter that it was correct, the evidence was good, and it was published by major newspapers. The technical elite agreed with the political elite, and it was struck from the internet.
Amazingly, when it was raised at the presidential debate, real-time polling suggested that the majority of the populace had no idea what was being talked about. The suppression was effective.
We live in dangerous times.
4 years ago by _7cdn
"Every record has been destroyed or falsified, every book rewritten, every picture has been repainted, every statue and street and building has been renamed, every date has been altered. And the process is continuing day by day and minute by minute. History has stopped. Nothing exists except an endless present in which the Party is always right." -George Orwell, 1984
4 years ago by grumblenum
Please read The Captive Mind for the version based on true events.
Spoiler: the Poles discovered that a political movement with methods and motivations which should sound very familiar didn't turn out to be the good guys after all.
4 years ago by semitones
The road to hell is paved with "good intentions". We desperately need to find a globally adoptable alternative to google and the services that it provides. Docs, Sheets, Drive, etc. are fantastic services in that they work really well on a massive scale. However, Google's increasing role as an arbiter of right vs wrong and a steward of information puts too much power into the hands of one corporation, whose best interests are provably not aligned with that of the general population.
I've been working as a SWE at google (in ads...) for over two years and I've really started to loathe it over the past year. The pay is fantastic and it's really hard to walk away from that, but the idea that they are not (or at least no longer) contributing to the better world that I think we need, has started to weigh heavier and heavier on me...
We should be able to implement services like these, that are free of ads, on globally distributed infrastructure, with no central authority, to have truly free-flowing information.
edit: added quotes around "good intentions"
4 years ago by izacus
I think your fundamental error is in the fact that you think that a private company (and market competition) can fix these issues. It seems that many people on HN are just waiting for the new Savior company that will magically have incentives to fight for them instead of making money. It's like hoping for market competition to create health regulation in the food industry.
Turns out, not even Apple is that messiah, and perhaps the solution isn't in demanding private companies to be your regulators and defenders of good morals and truth. What happened to having specialized agencies regulate and inspect industries?
4 years ago by duxup
The other truth is we're all outraged when these companies host some stuff that we don't like ... and get upset when they don't host the stuff we do like.
Consumers aren't rational. Neither are their demands.
I'm probably no more rational than anyone else, but I'm honest that I sure as hell don't want to give money to a service that is happy to host some violent folks content / garbage...
4 years ago by throwaways885
> The other truth is we're all outraged when these companies host some stuff that we don't like
Please, speak for yourself. I think these companies should host absolutely everything[0]. Between 2010-2016 was the golden age for these companies actually being free and open.
Edit: Within the law. To be honest, there is very little I see that should be censored beyond CP.
4 years ago by logifail
> we're all outraged when these companies host some stuff that we don't like
I'd argue that there is already a (fairly) tried and tested process in place to deal with this, it's the legal system.
There are plenty of media outlets that publish stuff I don't particularly like, but almost none of it is illegal, so - to be blunt - I just have to suck it up.
Some of my friends have opinions that I - at times - violently disagree with, but I file that under one of the side effects of life, and I deal with it.
I'm rarely "outraged" by companies hosting stuff. If it's illegal, knock yourself out and get it taken down.
However if it's just really, really annoying or you find it against your own worldview, perhaps take a deep breath / drink a cup of tea* / go to the gym / hug your OH, and move on to something more important?
* or gin :)
4 years ago by ceilingcorner
> The other truth is we're all outraged when these companies host some stuff that we don't like ... and get upset when they don't host the stuff we do like.
Is this actually true? I only think certain fringe Twitter groups are mad that companies host controversial things.
4 years ago by LudwigNagasena
I am absolutely happy if a company is ready to host everything legal.
4 years ago by TazeTSchnitzel
I think another error is assuming that having all content within a few hyper-scale hyper-global ad-supported commercial repositories of everything is a natural or healthy state of affairs. Many small websites dedicated to particular things is IMO generally better both from a free speech and a moderation standpoint than these giants that have to thread an impossible needle. In other words, web 1.0 was better.
4 years ago by rambambram
Can't agree more.
4 years ago by ksec
>Turns out, not even Apple is that messiah
They were, just no longer the same under Tim Cook.
I don't want to derail the discussion into another political debate, but my thesis is that some ideology spread like a plague in Silicon Valley. The Good vs Evil. As the OP said, Google started off being good, but somewhere along the line the definition of Good got twisted a little bit. They keep thinking they are so righteous that they literally started a crusade or witch-hunt (so to speak).
And it is in some way interesting because it rhymes with many historical events.
4 years ago by takoid
Quite relevant quote:
"Power tends to corrupt, and absolute power corrupts absolutely. Great men are almost always bad men, even when they exercise influence and not authority, still more when you superadd the tendency or the certainty of corruption by authority. There is no worse heresy than that the office sanctifies the holder of it."
- John Dalberg-Acton
4 years ago by magicalist
>> Turns out, not even Apple is that messiah
> They were, just no longer the same under Tim Cook.
Steve "we have a moral responsibility to keep porn off the iPhone" Jobs liked making devices for everyone but definitely not for everything they might want to do on them.
4 years ago by JasonFruit
What makes you think the government is that Messiah? Is it likely, in your view, that the government will go out of their way to "encourage the spread of misinformation"? I'm not seeing that happening.
4 years ago by bdcravens
I don't think izacus suggested it was. My read: there is no messiah, and a system of checks and balances is how we protect the public interest.
4 years ago by duxup
>What makes you think the government is that Messiah?
I didn't get that impression from that post...
4 years ago by paulluuk
In the EU, governments are actually trying very hard to discourage the spread of misinformation, as well as passing legislation that the tech sector has always claimed was not needed. So yes, it's very likely, as it's already happening.
4 years ago by hash872
The issue is that we know from experience, after 20+ years of the modern Internet, that if you make a 'free speech' drive/repository place that's widely available, it will host the absolute worst of the human race. Then, let's say you personally were in charge of said Free Speech Drive -- every day you'd get up and hear about people using it for (legal) jailbait photos, Islamic State recruiting, collaboration between extremist militia groups in various countries (including your own), actual illegal content, and so on. Pretty soon the FBI & CIA start contacting you about some of the actual or borderline illegal content being hosted on Free Speech Drive. Do you want to deal with that?
For one thing, it's easy to say 'well, we'd only take down illegal content'. But in practice there isn't such a bright line; there's lots of borderline stuff, and authorities could rule something posted on your site illegal after the fact -- lots of these situations are up to a prosecutor's judgement call. Would you risk jail to push the boundaries? Coordinating 1/6 wasn't necessarily illegal until it was.
If Islamic State is recruiting on Free Speech Drive, posting manifestos, encouraging Western residents to actual jihad -- you wouldn't take that down? You'd leave it up if it hewed to the line of being legal -- really? Jailbait or non-nude pics of someone's teenage daughter, hosted in the thousands -- you wouldn't take that down? It's easy to be an absolutist in an Internet argument; it's much harder when you face the sort of everyday content moderation issues you see in the real world.
4 years ago by kmeisthax
Another wrinkle in all of this is that you can use free speech as a form of censorship.
For example, if someone says something you don't like, you can intimidate them into shutting up by, say, threatening to reveal their personal information, such as their legal identity, address of residence, and so on. On certain corners of the Internet, merely dropping dox is good enough to get randos (who won't even be affiliated with you, so +1 to plausible deniability) to harass someone you want to shut up.
A more technical variant of this is the DDoS attack. Instead of trying to intimidate someone into shutting up with threats of stochastic terrorism, you shout over them by sending a bunch of traffic to their site until the server crashes or they run out of money.
So even if you're a hardcore free speech extremist, you still need to embrace some level of "censoring the censors" if you want the Internet to actually be usable.
4 years ago by hash872
Agreed. That's not even getting into pure spam, which, from people like Alex Stamos, I've heard is 100-1000x the issue that culture-war content moderation is. Once you've accepted that a platform can remove the kind of spam that killed MySpace -- or doxing, or a DDoS attack, as you say -- you're already on the (common-sense, IMO) road to content moderation. Which, again, from 25+ years of the modern Internet, we know is just mandatory to have a usable site.
4 years ago by native_samples
That's not censorship though. Threats are what people are forced to do when they cannot censor you, as censorship is much more direct. And DDoS attacks aren't speech.
4 years ago by mrtksn
> threatening to reveal their personal information, such as their legal identity
What's the problem with that? Bad things on the internet happen more often than not because of the lack of responsibility.
Doxxing has become the primary sin in the Internet religion but it would solve all kind of problems. I am going to commit that sin and say that Doxxing is the solution, you can downvote me and make my comment greyed out and censor me when you argue against censorship.
Instead of deleting content, simply make sure that it's linked to someone who can pay for it if it turns out to be something to be paid for.
The anonymity argument is only good when you are actively persecuted by a state actor. I don't agree that you deserve anonymity because the public will demonise you. If you hold strong beliefs that may be met harshly by the general public, you had better be ready for the pushback and think of ways to make them accepted. That's how it has always been done.
Therefore, when content is questionable, maybe the users should simply be KYC'ed and left alone until a legal takedown order is issued. If it's illegal (like illegal porn, copyrighted content, terroristic activities, etc.), go to prison for it. If it's BS, get your reputation tarnished.
4 years ago by the8472
> That if you make a 'free speech' drive/repository place that's widely available, it will host the absolute worst of the human race.
That's only due to selection effects. If being open were the default then they'd be diluted among all the other people. ISPs themselves, (older) reddit, 4chan all serve as examples that the people you don't want to talk to can be mostly siloed off to some corner and you can have your own corner where you can have fun. Things only get problematic once you add amplification mechanisms like twitter and facebook feeds or reddit's frontpage.
> For one thing, it's easy to say 'well, we'd only take down illegal content'. But in practice there isn't such a bright line; there's lots of borderline stuff, and authorities could rule something posted on your site illegal after the fact -- lots of these situations are up to a prosecutor's judgement call. Would you risk jail to push the boundaries?
I don't see how that's an issue? "They send a court order, you take down the content" is a perfectly reasonable default procedure. For some categories of content there already exist specific laws which require takedown on notification without a court order -- which exactly depends on jurisdiction, of course; in most places that would be at least copyright takedowns and child porn.
> Pretty soon the FBI & CIA start contacting you about some of the actual or borderline illegal content being hosted on Free Speech Drive. Do you want to deal with that?
That's pretty much what telcos have to deal with for example. Supposedly 4chan also gets requests from the FBI every now and then. It may be a nuisance, but not some insurmountable obstacle. For big players this shouldn't be an issue and smaller ones will fly under the radar most of the time anyway.
Also, having stricter policies doesn't make those problems go away. People will still post illegal content, but now in addition to dealing with the FBI you also need to deal with moderation policies, psychiatrists for your traumatized moderators (whom you're making see that content), and end users complaining about your policy covering X but not Y, or your policy being inconsistently enforced, or whatever.
4 years ago by ABCLAW
>ISPs themselves, (older) reddit, 4chan all serve as examples that the people you don't want to talk to can be mostly siloed off to some corner and you can have your own corner where you can have fun. Things only get problematic once you add amplification mechanisms like twitter and facebook feeds or reddit's frontpage.
This isn't true at all, and the reddit report following their ban wave is pretty clear about it; once areas that actively established a standard of violent or racist discourse as acceptable were banned, the volume of objectionable material across the site dropped.
4chan had a similar situation, where the culture on /b/ -- which was intentionally left as an explicitly unmoderated segment of the site, a silo -- actively invaded other boards with violent, racist content.
It isn't that people sit in silos and do nothing otherwise - it's that the silos themselves cause people to believe their content is acceptable, then spread that shit everywhere.
4 years ago by ryankupyn
I think this is a really good point, and I think that if anyone is really committed to promoting a free-speech-maximalist approach to the web, they should be focused on building tools that make it easier for people to host and distribute their own content without relying on a centralized service.
Any business with the technical ability to censor what they host is going to be tempted (and likely pressured by other actors) to take down content that people find objectionable. Removing these "chokepoints" where a small number of people have the ability to engage in mass censorship is key if you want to promote more diverse speech on the web. (Not everyone has this goal!)
4 years ago by walrus01
a lot of people who say they want an absolute free speech drive/free speech host have never actually worked for a colocation/dedicated server/hosting ISP and seen how the sausage is made.
4 years ago by travoc
Is "seeing how the sausage is made" a requirement for having beliefs or opinions on the matter?
4 years ago by semitones
It's a real problem. It's easier to suppress such content, but it just goes elsewhere, where it is almost completely unchecked; it proliferates in much darker circles as a result, and we have even less visibility into its true volume.
Maybe there should be more of an effort to reduce peoples' incentive to engage in that sort of behavior in the first place. Why do people join violent extremist groups? Why do people engage with CP? Why do terrorist groups exist? Is it just human nature? Is it a fact that with 7+ billion people we are destined to have millions of people engage in this behavior?
De-platforming horrible material is better than nothing, but it feels like whack-a-mole
4 years ago by commandlinefan
> Google's increasing role as an arbiter of right vs wrong
The problem is that (I doubt) Google is really doing this out of some misguided attempt at "protecting" people but rather as a reaction to what they perceive to be what the people want. When America was a very religious (Christian) country, media distributors stayed away from anything that appeared "blasphemous". They didn't necessarily do it because there was a law against it (there were some odd laws here and there, but the media didn't start actually challenging them until religion really fell out of favor), but because they were afraid of consumer reactions. Google (and every other tech company) is doing essentially the same thing here: speaking ill on certain topics is modern-day heresy and they just don't want to be attached to it because they do ultimately fear the consumer.
Even if you found a globally adoptable alternative to google, the same people who pushed Google to ban distribution of "misleading content" would start looking for ways to ban your globally adoptable alternative - at the network level if necessary (look what happened to Parler before they agreed to follow the unwritten rules). At the end of the day, we won't have truly free speech because far too few of us really want truly free speech.
4 years ago by unyttigfjelltol
>speaking ill on certain topics is modern-day heresy
The AUP would be more transparent if it simply banned "modern-day heresy". Folks would then be tagged as tech-heretics, and many would wear that badge with honor.
Calling questionable, unproven, unpopular or ambiguous information "misleading" -- it's doublespeak. Worse, having my cloud drive spontaneously dump or block my data because some algorithm or faceless reviewer disagrees with the content -- that's totally unacceptable as a consumer proposition. Is my Android phone next, simply because I'm posting an HN comment Google might disagree with? Seriously, it's completely unworkable from a consumer position for Google to arrogate that power to themselves.
4 years ago by themacguffinman
That's not how heresy works. People don't point fingers at you and hiss "heretic!"; instead they judge you and think you're a terrible human being who does terrible things, so they shouldn't help or associate with you. You can't "simply ban" heresy.
If you're genuinely interested in convincing people across the aisle to stop trying to ban stuff like this, simply yelling that it's "totally unacceptable as a consumer proposition" and "completely unworkable" and "doublespeak" is barely an argument. Evidently, many consumers are accepting it and will continue to accept it.
4 years ago by semitones
"At the end of the day, we won't have truly free speech because far too few of us really want truly free speech." - that's a terrifying prospect.
I agree, I don't think Google is trying to "protect" people. They are ultimately, almost always, protecting their pockets.
4 years ago by robbrown451
In a sense that may be true, but how is that different from a bar owner asking people to leave if they are causing a disturbance by being confrontational with other patrons? The bar owner doesn't want a fight that can damage property. He doesn't want people to avoid the bar because someone is picking fights, when most of his clientele just want to kick back and socialize with their friends.
Maybe the bar owner isn't trying to "protect" the rest of his customers, he's just trying to maintain a profitable business. Protecting his pockets.
You could apply the same logic to dang, who helps keep HN a pleasant place by doing the same things the bar owner is doing. Yes I imagine he is paid a salary, by the management of YCombinator who see HN as one part of their strategy to make a profit, and therefore his motives are equally cynical.
Ok. Honestly, you can probably reduce all human behavior to such simplistic motives if you want. What I see as someone being kind, you might see as a purely Darwinian strategy to get their genes in future generations.
I'm not all that sure that is a helpful perspective, at least not most of the time.
4 years ago by driverdan
> America was a very religious (Christian) country
America is still a very religious country. It's not as bad as it used to be but it's still pretty bad.
Don't forget that these companies are global. Your example is still happening with pictures of Muhammad. Many companies refuse to host or show them for fear of offending Islamic extremists.
4 years ago by ksec
Oh, thank you. That is an interesting take I hadn't thought about.
For those of us not from the US: does this "reaction to what they perceive to be what the people want" really represent the majority, as in your example of when America was very religious?
Because it seems to me (and I know zip about the US) that this action only pleases half and angers the other half?
4 years ago by amanaplanacanal
Given the amount of information Google has about its users I feel certain that they know exactly what percentage will be angered by this and what percentage will either applaud it or just not care.
4 years ago by efsavage
I think there's a vacuum here in that society wants someone to intervene when Bad Things Happen, but we either can't agree who that should be or (more likely IMO) the right choice of person/organization just doesn't exist. So you end up with some people/organizations/governments stepping up to increase their power and/or protect their own interests.
I think this is why Zuckerberg and some other big players have called for laws to regulate these things, which seems counterintuitive, but then FB can pass the buck and is more likely to maintain the status quo where they're on top. But until they are more insulated from the risks, they're going to be forced to defend themselves.
Disclaimer: I too work for Google
4 years ago by semitones
This is also probably why Google is advocating for the privacy sandbox and banning third-party-cookies, and staying ahead of the law tech-wise. Such that when the inevitable regulation of the playing field does come, they are sharing drinks and chuckling with the referees, while the other players are still struggling to figure out what their game plan is.
4 years ago by charwalker
It's also easier for FB/etc to push for laws to be written when they can pour millions into a PAC to get their ideal language into those bills, if not straight up write sections themselves. They can lobby for fines that are lower than the profit from acting in bad faith or anti-competitively (who even knows how much money FB saved by buying out Instagram/etc), they can run their own disinformation or targeted campaigns to sway public opinion, or simply minimize anything on their platform to hide it from users. There's a massive power imbalance there between a regular voter and Zuckerberg/etc, even an imbalance between regular voters who can or cannot vote early or by mail.
I support regulating these groups, but that must be done within the rights assigned via the constitution, existing precedent where available, and in-depth knowledge of how these companies operate and how the tech influences consumers. It's complicated.
4 years ago by lettergram
Interesting... so I wrote an article on gun violence
https://austingwalters.com/firearms-by-the-numbers/
Guess it contradicts the official position of the current administration (not reality). So will my drive content be removed?
Similarly, I have been monitoring the CDC changing the covid19 death rates over time.
https://austingwalters.com/changes-in-the-cdc-counts-of-deat...
It appears the CDC had been inaccurately portraying the death rate(s) (I assume unintentionally). Particularly, it appears there's a significant number of unexplained deaths. That could be "misleading?" because I regularly collaborate with others and we update the data.
"Misleading" does not mean inaccurate. Often the context matters, and how is Google going to take that into account? We have multiple theories and discuss them, find more information, and put it together.
4 years ago by tablespoon
> Guess it contradicts the official position of the current administration (not reality). So will my drive content be removed?
Below is the actual policy. What term do you think your blog post violates?
> Do not distribute content that deceives, misleads, or confuses users. This includes:
> Misleading content related to civic and democratic processes: Content that is demonstrably false and could significantly undermine participation or trust in civic or democratic processes. This includes information about public voting procedures, political candidate eligibility based on age / birthplace, election results, or census participation that contradicts official government records. It also includes incorrect claims that a political figure or government official has died, been involved in an accident, or is suffering from a sudden serious illness.
> Misleading content related to harmful health practices: Misleading health or medical content that promotes or encourages others to engage in practices that may lead to serious physical or emotional harm to individuals, or serious public health harm.
> Manipulated media: Media that has been technically manipulated or doctored in a way that misleads users and may pose a serious risk of egregious harm.
> Misleading content may be allowed in an educational, documentary, scientific, or artistic context, but please be mindful to provide enough information to help people understand this context. In some cases, no amount of context will allow this content to remain on our platforms.
4 years ago by Jiro
I can see how information that contradicts the CDC falls under "Misleading content related to harmful health practices: Misleading health or medical content that promotes or encourages others to engage in practices that may lead to serious physical or emotional harm to individuals, or serious public health harm."
Of course, it wouldn't actually fall under anything since it's not misleading, but such things get interpreted by social media censorship boards as misleading.
Also, the gun violence one may fall under "serious public health harm". There have been plenty of attempts to control guns using public health claims. https://www.apha.org/topics-and-issues/gun-violence
4 years ago by tablespoon
Both interpretations seem like mighty big stretches to find this particular content noncompliant with that policy. IMHO, if the policy gets stretched like that, then the issue isn't with the policy itself.
I mean, you could make similar stretches to hypothetically ban discussion of tax increases, because of the serious emotional harm that would cause to wealthy people fearing the loss of their money.
Honestly, the only issue with the actual text that I see is the reference to "emotional harm," given how subjective that is and how certain ideological propositions can be medicalized via that route. The rest of it is very reasonable, especially the paragraphs about civic processes and manipulated media.
4 years ago by lettergram
According to the current administration, gun violence is a public health crisis:
https://efsgv.org/learn/learn-more-about-gun-violence/public...
I present arguments that could be considered "misleading" based on the administration's official position. Personally, I'd like to actually fix the issues; to do so, we need to discuss the issues. With that in mind, I wrote something we can use as a framework to discuss them.
----
I simply did a deep dive into the data and found interesting results that differ (this is just a random selection):
(1) There doesn't appear to be a correlation between firearm access and homicides; if anything it's slightly reversed (see the sketch after this list for how such a correlation is checked) https://austingwalters.com/firearms-by-the-numbers/#Firearms...
(2) White, Hispanic, and Asian populations have among the lowest firearm homicide rates in the world. In contrast, the black population has one of the highest firearm homicide rates, which pushes the U.S. average up. (Which arguably could support the systemic racism theory, but it is a fact.) https://austingwalters.com/firearms-by-the-numbers/#Comparin...
(3) The CDC & FBI crime statistics show that <0.5% of the population is murdered by firearms in a given year (~1-1.5% if you include suicides).
(4) Self-defense homicides are included in the data
(5) Gangs don't appear to be the reason for a high firearm homicide rate https://austingwalters.com/firearms-by-the-numbers/#Gang_Dem...
(6) Homicides per firearm are very low https://austingwalters.com/firearms-by-the-numbers/#Homicide...
(7) You're more likely to be beaten to death or stabbed than shot (arguably guns would save you from this) https://austingwalters.com/firearms-by-the-numbers/#_Circums...
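For what it's worth, a check like claim (1) boils down to a simple correlation computation. A toy sketch follows; the numbers are randomly generated stand-ins, NOT the article's data:
```python
import numpy as np

# Synthetic stand-in data: 50 "states" with random ownership and homicide
# rates, purely to illustrate the computation.
rng = np.random.default_rng(42)
ownership = rng.uniform(0.10, 0.60, size=50)   # household ownership fraction
homicides = rng.uniform(1.0, 12.0, size=50)    # homicides per 100k

r = np.corrcoef(ownership, homicides)[0, 1]    # Pearson r in [-1, 1]
print(f"Pearson r = {r:+.2f}")                 # near 0 -> no linear relationship
```
A value of r near zero is what "no correlation" means in claims like (1); a meaningfully positive r would support "more guns, more homicides".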
4 years ago by hxjemzbskwkxb
Sorry, but your content is misleading. Take for example the following quote from Amnesty International, which you cite in your article:
> governments [with] poor regulation of the possession and use of guns lead to violence and that they must tackle this now through strict controls on guns and effective interventions in communities suffering high levels of gun violence.
From this you say the following:
> The key statement is:
Guns lead to violence
The statement above implies a couple of things:
1. Gun volume and violence are correlated
2. As the number of guns increase, violence increases
----
This is a blatant distortion of what that quote from Amnesty is saying.
They are clearly saying that _poor regulation of the possession and use of guns_ leads to violence.
You are quite obviously engaging in bad faith arguments.
edit: formatting
4 years ago by CheezeIt
> It also includes incorrect claims that a political figure or government official has died, been involved in an accident, or is suffering from a sudden serious illness.
In other words, it bans correct claims that Hillary Clinton had health problems, and that Joe Biden has dementia.
4 years ago by tablespoon
>> It also includes incorrect claims that a political figure or government official has died, been involved in an accident, or is suffering from a sudden serious illness.
> In other words, it bans correct claims...that Joe Biden has dementia.
Do you have a source for that that isn't blatant uninformed speculation, hopes and wishes, or a doctored video [1]? Preferably one that is well known and has enough credibility to not be banned from Wikipedia.
Also, which of these is dementia?
1. death
2. an accident
3. a sudden, serious illness
[1] IIRC, during the election there was one pushed by the Trump campaign that was misleadingly edited and slowed down to create the false impression that Biden had dementia.
4 years ago by shrimpx
Joe Biden has dementia?
4 years ago by Pick-A-Hill2019
Ahh Hell No!
I mean - Who gets to define 'misleading and/or confusing'? Google? A court case?
If they (Google) want to impose restrictions such as "Do not distribute content that deceives, misleads, or confuses users" - Might I suggest that they apply those very same standards to their own behavior.
* But who can watch the watchmen? *
I am grateful that I have a symmetric ftth connection. Information (correct or incorrect) needs to be free - as in Free to be expressed; Free to be ridiculed; Free to be disseminated; Free to be discussed; Free to be exposed to the light of day.
It is for the people themselves to decide what they do or do not believe.
Do I want Google to decide what is or is not misleading or confusing? Eh??? Say Whhaaatttt!!! Ever read a contract's terms and conditions? They can be as confusing AF… so uhhmm, Yep, Let's Ban 'em! Woot!
Google, Eat Your Own Dogfood.
4 years ago by s3r3nity
> Who gets to define 'misleading and/or confusing'? Google? A court case?
They claim that "misleading" content "includes information [...] that contradicts official government records."
This is a wildly Orwellian way to dictate "truth": it's whatever the government body says is true. (1984 _literally_ has a "Ministry of Truth".)
Authorities can be wrong, and can _themselves_ be incentivized to mislead. Why center trust on them vs. individual responsibility?
Helping people understand, weigh, and index information and sources is an important problem - but solving it in this way is _absolutely not_ the right way to do it.
EDIT: spelling & grammar
4 years ago by fulafel
That's not correct, the text with more context is:
> This includes information about public voting procedures, political candidate eligibility based on age / birthplace, election results, or census participation that contradicts official government records.
There is no wider applicability of "government records" specified.
4 years ago by inglor_cz
So, why is the government considered infallible precisely on those topics, if it is tacitly understood that it can actually make mistakes elsewhere?
Also, the formulation feels strangely US-centric, even though people use Google Drive all over the world, and it does not explicitly say which governments are considered infallible in their census records. With no qualification, "government" could well be the government of Iraq, Hong Kong, or Belarus.
4 years ago by floren
> I am grateful that I have a symmetric ftth connection. Information (correct or incorrect) needs to be free - as in Free to be expressed; Free to be ridiculed; Free to be disseminated; Free to be discussed; Free to be exposed to the light of day.
You have a symmetric FTTH connection for now. Host something somebody considers too offensive and they'll get your connection pulled. We don't hear about this because essentially nobody hosts their own stuff at home, but all it would take is identifying the ASN of your IP address and making a sufficiently loud noise on Twitter.
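That lookup really is trivial. A rough sketch using Team Cymru's public IP-to-ASN whois service (the service is real, though its response format may change, and 8.8.8.8 is just an example address):
```python
import socket

def asn_for_ip(ip: str) -> str:
    """Query Team Cymru's public IP-to-ASN whois service (port 43)."""
    with socket.create_connection(("whois.cymru.com", 43), timeout=10) as s:
        s.sendall(f" -v {ip}\r\n".encode())
        chunks = []
        while data := s.recv(4096):
            chunks.append(data)
    # Pipe-delimited reply: AS | IP | BGP prefix | CC | registry | allocated | AS name
    return b"".join(chunks).decode()

print(asn_for_ip("8.8.8.8"))  # swap in the host's address
```
The "AS name" field is effectively the ISP to complain to, which is the whole point.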
4 years ago by Pick-A-Hill2019
You are absolutely right. If I ran unencrypted file sharing or torrented something that matched a hash somewhere in some system, then yes, absolutely, my connection would be shut off. While I could debate the rights and wrongs of that, I take the coward's approach and just shuffle random bits of bytes backwards and forwards.
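The hash matching described here is about as simple as detection gets: hash the shared file and check it against a list of known-bad digests. A sketch, with a hypothetical blocklist file and filename:
```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file in 1 MiB chunks so large files don't fill memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical blocklist: one known-bad hex digest per line.
blocklist = set(Path("known_hashes.txt").read_text().split())

if sha256_of(Path("shared_file.bin")) in blocklist:
    print("match: this file would be flagged")
```
Flipping even a single byte produces a completely different digest, which is why "shuffling the bytes" (or encrypting them) defeats this kind of naive matching; real systems supplement it with fuzzy or perceptual hashes for media.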
4 years ago by shadowgovt
I think Google would agree with you. If you're going to share information that (in Google's perception) is wildly misleading, they'd prefer you do it off a domain that doesn't have "Google" in its path. And all of us are free to do that.
4 years ago by Pick-A-Hill2019
That is a very valid point and I'm sorry you are seeing downvotes on your comment (so have an upvote from me).
The "My Server, My Rules" point of view is absolutely 100% valid ( and also why I commented about asymmetrical ftth).
But, What I am also saying is this -
At what point did the 'Do No Evil' Google pivot into the position they are currently taking?
When did the ethos of 'Let's do good things for good reasons for the good of humanity' pivot into what is potentially a Section 230 nightmare?
To repeat my question - Who judges (and adjudicates) what is or is not misleading or confusing?
4 years ago by shadowgovt
I think Google (and the rest of FAANG) is wrestling with the uncomfortable possibility that they are pawns in several nation-states' disinfo campaigns, and they suspect their previous lack of intervention and professional having-no-opinion on questions of fact made everything worse.
It's possible that "Don't be evil" means "exercise control over the things you create." Frankenstein's monster wasn't created evil... It learned cruelty after its creator abdicated responsibility for it and it was exposed nakedly to a cruel world.
4 years ago by avivo
Google can do a much better job of making clear what is out of scope. When I first saw this headline I was also surprised and outraged.
Looking more closely, this is just about distribution, probably in terms of "content hosting". It doesn't target individuals or families storing whatever they want for themselves. It makes more sense in that context.
For example, if I create a fake video urging people to vote illegally, or at the wrong time, and I am sharing it through Google Drive with many, many people seeing it, Google wants a policy to prevent that sharing.
Otherwise either its hands are tied or it's just doing arbitrary things, which is far more authoritarian.
If a document is shared and accessed by thousands of people, it makes plausible sense that Google might not want to essentially be a hosting service if that content is leading to real-world harm.
...but this has not been made explicit enough for such a sensitive issue, with real speech and free-expression concerns (and there are, as always, real concerns about who decides what is misinformation).
4 years ago by WalterBright
> if that content is leading to real-world harm
The truth can also lead to real-world harm.
4 years ago by wolverine876
We can't make any distinction at all?
4 years ago by tomcam
Serious here, not trying to be combative. How about sticking to what's legal under the First Amendment of the Constitution?
4 years ago by rscoots
>For example, if I create a fake video urging people to vote illegally, or at the wrong time
Just so you know, this is a federal crime so Google is legally required to remove such content. So that specific example doesn't really apply.