4 hours ago by semitones
The road to hell is paved with "good intentions". We desperately need to find a globally adoptable alternative to google and the services that it provides. Docs, Sheets, Drive, etc. are fantastic services in that they work really well on a massive scale. However, Google's increasing role as an arbiter of right vs wrong and a steward of information puts too much power into the hands of one corporation, whose best interests are provably not aligned with those of the general population.
I've been working as a SWE at google (in ads...) for over two years and I've really started to loathe it over the past year. The pay is fantastic and it's really hard to walk away from that, but the idea that they are not (or at least no longer) contributing to the better world that I think we need has started to weigh heavier and heavier on me...
We should be able to implement services like these, that are free of ads, on globally distributed infrastructure, with no central authority, to have truly free-flowing information.
edit: added quotes around "good intentions"
2 hours ago by izacus
I think your fundamental error is thinking that a private company (and market competition) can fix these issues. It seems that many people on HN are just waiting for the next Savior company that will magically have incentives to fight for them instead of making money. It's like hoping for market competition to create health regulation in the food industry.
Turns out, not even Apple is that messiah, and perhaps the solution isn't in demanding private companies to be your regulators and defenders of good morals and truth. What happened to having specialized agencies regulate and inspect industries?
2 hours ago by duxup
The other truth is we're all outraged when these companies host some stuff that we don't like ... and get upset when they don't host the stuff we do like.
Consumers aren't rational. Neither are their demands.
I'm probably no more rational than anyone else, but I'm honest that I sure as hell don't want to give money to a service that is happy to host some violent folks' content / garbage...
2 hours ago by throwaways885
> The other truth is we're all outraged when these companies host some stuff that we don't like
Please, speak for yourself. I think these companies should host absolutely everything[0]. The years 2010 to 2016 were the golden age for these companies actually being free and open.
Edit: Within the law. To be honest, there is very little I see that should be censored beyond CP.
2 hours ago by ceilingcorner
> The other truth is we're all outraged when these companies host some stuff that we don't like ... and get upset when they don't host the stuff we do like.
Is this actually true? I only think certain fringe Twitter groups are mad that companies host controversial things.
42 minutes ago by logifail
> we're all outraged when these companies host some stuff that we don't like
I'd argue that there is already a (fairly) tried and tested process in place to deal with this: the legal system.
There are plenty of media outlets that publish stuff I don't particularly like, but almost none of it is illegal, so - to be blunt - I just have to suck it up.
Some of my friends have opinions that I - at times - violently disagree with, but I file that under one of the side effects of life, and I deal with it.
I'm rarely "outraged" by companies hosting stuff. If it's illegal, knock yourself out and get it taken down.
However if it's just really, really annoying or you find it against your own worldview, perhaps take a deep breath / drink a cup of tea* / go to the gym / hug your OH, and move on to something more important?
* or gin :)
an hour ago by LudwigNagasena
I am absolutely happy if a company is ready to host everything legal.
34 minutes ago by ksec
>Turns out, not even Apple is that messiah
They were, just no longer the same under Tim Cook.
I don't want to derail the discussion into another political debate, but my thesis is that some ideology spread like a plague in Silicon Valley. The Good vs Evil. As the OP said, Google started off being good, but somewhere along the line the definition of Good got twisted a little bit. They kept thinking they were so righteous that they literally started a crusade or witch-hunt (so to speak).
And it is in some way interesting because it rhymes with many historical events.
14 minutes ago by mgraczyk
I would much rather live with censorship that can be removed by withholding dollars vs one that requires votes.
Just look at the puritanical rules concerning language and sexuality forced on broadcasters. Those rules are decades old and outdated and will likely remain forever.
When companies fuck up on censorship, the results seem to last only a few years. When governments fuck up, the consequences echo for multiple lifetimes.
2 hours ago by JasonFruit
What makes you think the government is that Messiah? Is it likely, in your view, that the government will go out of their way to "encourage the spread of misinformation"? I'm not seeing that happening.
2 hours ago by bdcravens
I don't think izacus suggested it was. My read: there is no messiah, and a system of checks and balances is how we protect the public interest.
2 hours ago by duxup
>What makes you think the government is that Messiah?
I didn't get that impression from that post...
2 hours ago by paulluuk
In the EU, governments are actually trying very hard to discourage the spread of misinformation, as well as passing legislation that the tech sector has always claimed was not needed. So yes, it's very likely, as it's already happening.
2 hours ago by hash872
The issue is that we know from experience, after 20+ years of the modern Internet, that if you make a 'free speech' drive/repository place that's widely available, it will host the absolute worst of the human race. Then, let's say you personally were in charge of said Free Speech Drive- every day you'd get up and hear about people using it for (legal) jailbait photos, Islamic State recruiting, collaboration between extremist militia groups in various countries (including your own), actual illegal content, and so on. Pretty soon the FBI & CIA start contacting you about some of the actual or borderline illegal content being hosted on Free Speech Drive. Do you want to deal with that?
For one thing, it's easy to say 'well we'd only take down illegal content'. But in practice there isn't such a bright line, there's lots of borderline stuff, authorities could rule something posted on your site illegal after the fact- lots of these situations are up to a prosecutor's judgement call. Would you risk jail to push the boundaries? Coordinating 1/6 wasn't necessarily illegal until- it was.
If Islamic State is recruiting on Free Speech Drive, posting manifestos, encouraging Western residents to actual jihad- you wouldn't take that down? You'd leave it up if it hewed to the line of being legal- really? Jailbait or non-nude pics of someone's teenage daughter, hosted in the thousands- you wouldn't take that down? It's easy to be an absolutist in an Internet argument; it's much harder when you face the sort of everyday content moderation issues you see in the real world.
30 minutes ago by the8472
> That if you make a 'free speech' drive/repository place that's widely available, it will host the absolute worst of the human race.
That's only due to selection effects. If being open were the default then they'd be diluted among all the other people. ISPs themselves, (older) reddit, 4chan all serve as examples that the people you don't want to talk to can be mostly siloed off to some corner and you can have your own corner where you can have fun. Things only get problematic once you add amplification mechanisms like twitter and facebook feeds or reddit's frontpage.
> For one thing, it's easy to say 'well we'd only take down illegal content'. But in practice there isn't such a bright line, there's lots of borderline stuff, authorities could rule something posted on your site illegal after the fact- lots of these situations are up to a prosecutor's judgement call. Would you risk jail to push the boundaries?
I don't see how that's an issue. "They send a court order, you take down the content" is a perfectly reasonable default procedure. For some categories of content there already exist specific laws which require takedown on notification without a court order; exactly which depends on jurisdiction of course, but in most places that would be at least copyright takedowns and child porn.
> Pretty soon the FBI & CIA start contacting you about some of the actual or borderline illegal content being hosted on Free Speech Drive. Do you want to deal with that?
That's pretty much what telcos have to deal with for example. Supposedly 4chan also gets requests from the FBI every now and then. It may be a nuisance, but not some insurmountable obstacle. For big players this shouldn't be an issue and smaller ones will fly under the radar most of the time anyway.
Also, having stricter policies doesn't make those problems go away. People will still post illegal content, but now in addition to dealing with the FBI you also need to deal with moderation policies, psychiatrists for your traumatized moderators (whom you're forcing to see that content), and endusers complaining about your policy covering X but not Y or your policy being inconsistently enforced or whatever.
an hour ago by kmeisthax
Another wrinkle in all of this is that you can use free speech as a form of censorship.
For example, if someone says something you don't like, you can intimidate them into shutting up by, say, threatening to reveal their personal information, such as their legal identity, address of residence, and so on. On certain corners of the Internet, merely dropping dox is good enough to get randos (who won't even be affiliated with you, so +1 to plausible deniability) to harass someone you want to shut up.
A more technical variant of this is DDoS attacks. Instead of trying to intimidate someone into shutting up with threats of stochastic terrorism, you shout over them by sending a bunch of traffic to their site until the server crashes or they run out of money.
So even if you're a hardcore free speech extremist, you still need to embrace some level of "censoring the censors" if you want the Internet to actually be usable.
an hour ago by native_samples
That's not censorship though. Threats are what people are forced to do when they cannot censor you, as censorship is much more direct. And DDoS attacks aren't speech.
an hour ago by hash872
Agreed. That's not even getting into just pure spam, which from people like Alex Stamos I've heard is 100-1000x the issue that culture war content moderation is. Once you've accepted that a platform can remove the kind of spam that killed MySpace- or doxing or a DDoS attack, as you say- you're already on the (common sense IMO) road to content moderation. Which again, from 25+ years of the modern Internet, we know is just mandatory to have a useable site
an hour ago by semitones
It's a real problem. It's easy to suppress such content, but the problem is that it just goes elsewhere, where it is almost completely unchecked; it proliferates in much darker circles as a result, and we have even less visibility into its true volume.
Maybe there should be more of an effort to reduce peoples' incentive to engage in that sort of behavior in the first place. Why do people join violent extremist groups? Why do people engage with CP? Why do terrorist groups exist? Is it just human nature? Is it a fact that with 7+ billion people we are destined to have millions of people engage in this behavior?
De-platforming horrible material is better than nothing, but it feels like whack-a-mole
2 hours ago by walrus01
a lot of people who say they want an absolute free speech drive/free speech host have never actually worked for a colocation/dedicated server/hosting ISP and seen how the sausage is made.
an hour ago by travoc
Is "seeing how the sausage is made" a requirement for having beliefs or opinions on the matter?
a minute ago by 13years
> contributing to the better world that I think we need, has started to weigh heavier and heavier on me...
This is the problem. We have arrived where we are because some group of people think that powerful companies "should make the world better".
Historically, much of the major atrocities were carried out with these very intentions.
How to build a tyrant in one simple lesson: 1) Take any normal person who wants to make the world better 2) Give them the power to do so
2 hours ago by efsavage
I think there's a vacuum here in that society wants someone to intervene when Bad Things Happen, but we either can't agree who that should be or (more likely IMO) the right choice of person/organization just doesn't exist. So you end up with some people/organizations/governments stepping up to increase their power and/or protect their own interests.
I think this is why Zuckerberg and some other big players have called for laws to regulate these things, which seems counterintuitive, but then FB can pass the buck and is more likely to maintain the status quo where they're on top. But until they are more insulated from the risks, they're going to be forced to defend themselves.
Disclaimer: I too work for Google
2 hours ago by semitones
This is also probably why Google is advocating for the privacy sandbox and banning third-party-cookies, and staying ahead of the law tech-wise. Such that when the inevitable regulation of the playing field does come, they are sharing drinks and chuckling with the referees, while the other players are still struggling to figure out what their game plan is.
2 hours ago by charwalker
It's also easier for FB/etc to push for laws to be written when they can pour millions into a PAC to get their ideal language into those bills, if not straight up write sections themselves. They can lobby for fines that are lower than the profit from acting in bad faith or anti-competitively (who even knows how much money FB saved by buying out Instagram/etc), they can run their own disinformation or targeted campaigns to sway public opinion, or simply minimize anything on their platform to hide it from users. There's a massive power imbalance there between a regular voter and Zuckerberg/etc, even an imbalance between regular voters who can or cannot vote early or by mail.
I support regulating these groups, but that must be done within the rights assigned by the constitution, existing precedent where available, and in-depth knowledge of how these companies operate and how the tech influences consumers. It's complicated.
43 minutes ago by crazygringo
I see a lot of comments misinterpreting this.
First, it's not about private files, it's about distributing content.
Google isn't spying on your private files, but does scan them when you share them publicly. E.g. keep all the pirated movies you want on your Drive, and even give private access to friends, but the moment you make them publicly viewable Google scans them and limits access accordingly. So no, this isn't applying to your private diary or privately shared documents.
And second, to those who claim absolute free speech with no limits -- notice that the two main categories here are related to democracy and health. All our legal protections ultimately depend on a democratic foundation -- undo that with misinformation and you don't have anything anymore. Similarly, your rights don't matter much if you're dead. Companies aren't allowed to advertise rat poison as medicine and neither are you.
15 minutes ago by contravariant
There's something fundamentally flawed about the idea that censorship in the name of preventing misinformation is protecting the foundation of democracy.
You cannot have true democracy if people cannot disagree with their governments; they must be able to disagree with any truth or opinion such a government might consider self-evident, just on the off chance they're right.
I should at this point note that Google doesn't directly claim to go quite that far in preventing misinformation; they mostly claim to disallow things that could harm the democratic process (e.g. telling people to vote at the wrong place, or that their candidate has died, etc.). At least that kind of information is usually agreed upon (if not, there are bigger problems than mere misinformation), though they seem to try to include claims of voter fraud, which is a bit dangerous.
5 minutes ago by hackererror404
Imagine if Britain had this same technology when the USA was founded... It of course would have quickly cracked down on communications and it would have done so in the name of "peace" and "what's right"...
Thinking critically of a government, or even believing that perhaps the government as it stands today is not the government "of and for the people", could easily be branded anti-democracy by that same corrupt government. Maybe such a belief isn't correct, but who is the government to say that we can or cannot challenge them in public discourse, which is supposedly protected under the first amendment?
This is indeed an insanely slippery slope and people willing to trade their freedoms because they think it's for the ultimate good, I think are really making a mistake... it's not difficult to understand that this is one of the first steps of an actual fundamentally corrupt government... This is easily open to abuse and vast interpretation.
27 minutes ago by freedomben
what about if you share a file with select people? Is that still "private" or does it become public the moment you give someone else access?
29 minutes ago by gjs278
I only get two comments a day and I'm going to spend the next one on you as well. the rest of the world would be better off if you were dead because you are leading us down a path to actual hell. let people write whatever the fuck they want you fucking fascist prick. not everyone knows how to make a website and google drive might be how they communicate their information. it's the equivalent to locking up a guy standing on venice beach with a megaphone because he sounds crazy, and he probably is, but he's got the right to do it. google drive is a common carrier and as long as they're not violating the laws of the country it's being distributed in, they should fuck off and not worry about it.
34 minutes ago by gjs278
the rest of us were fine before google did this drive ban. how about they just put a warning on it for morons like you and let the rest of us read it?
4 hours ago by jonnycomputer
What. The. Hell.
I suppose limiting sharing might be justified. But limiting access and/or deleting own files is outrageous.
For that matter, collections of misleading content can have legitimate purposes. Such as research.
Now I'm seriously considering dropping my Drive subscription.
Update: I got a lot of upvotes, but tbh I may have misunderstood the new policy, which does seem to (maybe?) only be limited to distribution of misleading content. I do think Google has a legitimate interest in regulating use of its services as a means of distributing information. I'm not sure where the line is, though. For example, if I sent an email to a friend in which I said something that isn't true, I think I would rightly be upset if Google refused to deliver the email. OTOH, if I was sending this to large numbers of people regularly, as part of some kind of misinformation operation, then maybe blocking me would be legitimate. Complicated.
an hour ago by dfdz
A recent high-profile example is the lab leak "conspiracy theory". Facebook blocked discussion of the possibility that Covid-19 started from a lab leak, but now scientists are seriously considering the issue.
Imagine if Google had this policy last year, and a scientist posted a Google Slides presentation about the merits of the lab theory (and imagine it was banned under the misleading "health" information umbrella) ....
Disclaimer: I use dropbox (mostly since they have supported linux since when I started using the service, and back then the syncing was much better than google drive's). But now I have another reason!
30 minutes ago by zarkov99
Something similar is happening now with the Ivermectin discussion, which to some sounds as crazy today as the lab leak sounded to some a few months ago. So we have learned nothing from the lab leak debacle.
24 minutes ago by rtkwe
I remember the original theory going around that was getting removed: that it was a lab leak of a bioweapon, not the current theory that it was more normal 'gain of function' research that got out. Those are two very different accusations.
an hour ago by Latty
There is a distinction between discussing a theory, and presenting it as the truth with no evidence.
Not a distinction I imagine automated moderation systems are going to manage well, of course, but it exists.
13 minutes ago by seanclayton
Google will gladly distribute public PDFs that say the elimination of Jews "must necessarily be a bloody process," presented as truth with no evidence.
Why do they get to be the arbiter of what information deserves presentation as truth with no evidence, when they allow Mein Kampf to be presented as such?
This is all assuming that setting a file to public in Google Drive is considered "presenting it as the truth with no evidence."
an hour ago by pageandrew
This distinction never mattered. Any discussion of the theory besides refuting it as "debunked" was banned on Facebook.
2 hours ago by jfengel
Many of the components of the abuse page talk about "distribution", but the one about hate speech is simply "Do not engage in hate speech".
The way I read that, they could conceivably boot you just for having a private diary of racist rants. I don't know if they intend it that way; it's also possible that they'd interpret "speech" as meaning "speech where somebody else can hear you".
But if you were looking for more reason to drop your Drive subscription, that might be it.
an hour ago by Uhhrrr
Also, what if the content wasn't racist when you wrote it?
2 hours ago by BitwiseFool
The Terms of Service are written broadly and that's on purpose so that Google can take the liberty of interpreting things however they please. Ultimately, it comes down to the culture of the team that deals with content scanning and take-down requests. I'm willing to bet money that it is staffed by Progressive Bay-Area folks and the implications should be self evident.
2 hours ago by polynomial
Even more problematic is the difficulty current algorithms have in distinguishing "engaging" in a certain class of speech acts from studying said class. This requires a subtlety of parsing humans are (sometimes) capable of, but machines still struggle with.
2 hours ago by ixacto
Does this apply to google workspace (paying) customers too?
If so google is actually scanning what is supposed to be private information.
an hour ago by stingraycharles
I looked closely but I cannot find anything that says that paying customers are excluded. So I would assume it is included.
It makes sense, though, to enforce one single policy: otherwise, just becoming a paying customer would be a loophole to circumvent phishing etc policies.
I guess the crux here is when, exactly, they start scanning the content. Is that perhaps the moment the content is first shared?
It doesn't seem to be documented.
an hour ago by read_if_gay_
> If so google is actually scanning what is supposed to be private information.
Am I just too cynical or are you naive for thinking they aren't datamining absolutely anything they can get their hands on?
an hour ago by DaniloDias
Logs are only useful if they are monitored.
This individual seems to look at events that happen that don't align with their prejudices and assumes that the signal is an aberration.
Nihilists get burned. You have to hold people accountable for the consequences of their action rather than their intention.
an hour ago by m-p-3
If it's not end-to-end encrypted, it's not private from Google.
Google can and will scan anything you feed it.
an hour ago by ixacto
I'm tempted to make a bunch of honeypot accounts just to see how Orwellian and/or gameable their system is.
What will trigger it? Fan pages for Trump/Qanon? Antivaxxer propaganda? The German federal police paper on how to clandestinely manufacture heroin? Or maybe just a 20mb text file with the "å" character?!
an hour ago by Alex3917
> I suppose limiting sharing might be justified. But limiting access and/or deleting own files is outrageous.
It's definitely not just a hypothetical risk either. In the last month, Facebook blocked me from distributing my blog post on Django best practices because they claim it violates their community standards. It's literally a post about how to structure API code, but now they're claiming that it's advocating for genocide or something and there's apparently no one to appeal to or any way to reverse this.
For reference: https://alexkrupp.typepad.com/sensemaking/2021/06/django-for...
https://developers.facebook.com/tools/debug/?q=https%3A%2F%2...
5 minutes ago by jonnycomputer
That's crazy.
Twice in the last week a comment of mine got blocked on Facebook for violating policies. In the first case, I posted a link to an EPA website and a screenshot of a graph posted on it, showing a reduction in smog in the last 10 years compared to 1987. In the second, I pasted a screenshot of something from the CDC's COVID Tracking site and a link to a British research paper on the efficacy of the vaccines wrt the delta variant. In the second case, my link was somehow broken, which might have contributed to the problem.
They never went back up.
4 hours ago by josephcsible
Their definition of misleading says it "includes information [...] that contradicts official government records". Because no government record has ever been wrong before, right?
3 hours ago by markzzerella
Early on Fauci was telling everyone how useless masks were [1].
I have a couple people in my social circle that were banned from fb for saying he was wrong early on about masks being ineffective. And several others were banned later for pointing out fauci's earlier stance and calling it all propaganda. You are not allowed to think for yourself. Pick up that can.
35 minutes ago by YeBanKo
"We have recently been notified of a potential policy violation and after a thorough review of the video materials uploaded, it has been determined that the content is misleading and contradicts official government records. As a result of this decision, you access terminating to Youtube and other Google products, including Gmail, has been terminated. The decisions is final. This message is auto-generated"
"The past was alterable. The past never had been altered. Oceania was at war with Eastasia. Oceania had always been at war with Eastasia."
an hour ago by read_if_gay_
Lab leak hypothesis.
3 hours ago by deregulateMed
N95 masks were never useless and Fauci should be seen as a monster.
What I don't understand is why no nations were giving away N95 masks, along with education on how to wear them. It seems like we hiveminded our way into the unsustainable lockdowns.
2 hours ago by merlinscholz
Last year in Germany you could get (IIRC) 7 N95/FFP2 masks for free at the local pharmacy. The pharmacists were supposed to show you how to wear them.
2 hours ago by djrogers
> What I don't understand is why there were no nations giving away n95 masks
Don't know if you remember last march/April, but there were no N95 masks to give out in a lot of places...
4 hours ago by gjsman-1000
A. Which government? All governments in general collectively? Or just the governments you want to believe? I'm sure they aren't going to listen to the COVID skeptic President of Brazil, right?
Even better, as a user, can I appeal to the authority of a different government than the one I live in? Let's say I appeal to the authority of Brazil as governing my content even though I live in the US. How does that work?
B. Because no government has ever lied on official records when there is a disaster. For sure. And no government is currently, right now, lying on their records to save face. For sure.
2 hours ago by zpeti
Well, we only have to go back a year, and you are saying the Trump admin was the only one speaking truth...
I find it insane that Silicon Valley companies have such short memories, and can't even comprehend that 1 year ago the same policies would have resulted in banning anything Trump disagreed with.
40 minutes ago by native_samples
The "policy" is merely a fig leaf to disguise untrammelled totalitarianism and arbitrary abuses of power.
It's sad. I used to work there. What monster did we create, exactly?
3 hours ago by Negitivefrags
It's amazing to see an authoritarian regime establish itself in real time in front of our eyes.
It's so easy to look at the past and think "How did people let this happen?".
And yet here we are.
I hope stopping the anti-vaxers and election fraud people is worth it.
3 hours ago by cpr
Especially as the news about the election fraud slowly rolls out across multiple states...
an hour ago by defaultname
Did I accidentally stumble into The_Donald? The bizarre, fact-free comments that dominate this discussion are simply gross for HN.
How in the world was this comment flagged, beyond the brigading of the most ignorant of deplorables. There is zero information about "election fraud" coming out, much less from multiple states, but this is the tack these horrendous cretins use to ply their disinformation noise. But did you see all of the news coming out about how the "MAGA" crew are actually lizard people with sub-50 IQs? It's true, you'll see. It's true! The Cyber Biologists did a "study" and they pointed out that in a picture the insurrectionists reflected light just so, clearly demonstrating that they must be lizard people.
It is embarrassing to see this on HN. The US is turning into a laughing stock.
4 hours ago by neuronexmachina
The full paragraph for reference:
> Misleading content related to civic and democratic processes: Content that is demonstrably false and could significantly undermine participation or trust in civic or democratic processes. This includes information about public voting procedures, political candidate eligibility based on age / birthplace, election results, or census participation that contradicts official government records. It also includes incorrect claims that a political figure or government official has died, been involved in an accident, or is suffering from a sudden serious illness.
2 hours ago by irthomasthomas
I just cancelled my drive subscription.
I don't know how anyone could continue using it after this.
I probably have dozens of docs and hundreds of research papers contradicting government health advice on diabetes and heart disease. These would fall under "Misleading content related to harmful health practices" since they promote a health theory which the government considers harmful.
However, I would have cancelled regardless, since the idea of automatic bans and/or content deletion based on ML models is crazy. They are obviously going to find a lot of false positives, and I can't deal with the idea of trying to speak to google to explain that their algorithm mistakenly flagged my innocent content. In other words, even if you are the perfect citizen, there is a chance you will get flagged anyway.
an hour ago by tyingq
Yes, this is really bizarre. I get this kind of policy for some kinds of platforms, but not Drive, Docs, Sheets, Slides, and Forms. Those are my documents, and I should be free to put whatever I want in them.
What if I just like to collect and share old conspiracy theory stuff that I know is wrong? For whimsy, historical, whatever purposes...
28 minutes ago by kderbyma
going to follow suit
an hour ago by ukie
It will change nothing. Google has too much money. Getting away from its services is the right thing to do though.
3 hours ago by Mountain_Skies
Simple solution: Google can delete whatever it wants for being "misleading content," but if they ever get it wrong, even in the slightest, 10% of Google's wealth is transferred to the aggrieved. If Google is so confident in its ability to be the sole arbiter of truth for humanity, they should have zero issue with agreeing to this, because it will be zero risk.
Of course they will never agree to such a condition, because they know that much of the time they aren't even capable of figuring out what products the market wants. The Google Graveyard is proof of this. They sure as hell aren't capable of knowing more about virology than the virologists of the world, who are still trying to figure things out.
Tech companies wanting to be in control of human ability to communicate with each other is simply a power play. Combat it with risk and they'll have to back down or suffer the consequences when they make a mistake (which they will, often).
an hour ago by judge2020
You'll find that not many people are in favor of Google being so dominant, and thus able to limit the spread of information (or misinformation). But you'll also find almost everyone is against actually taking some of Google's wealth for doing so, since they're still a private company and can do whatever they want with their services, even if that means deleting information or removing files with the letter 'x' in their file names.
an hour ago by charonn0
This might be reasonable if the aggrieved user has a right that Google violated. But that doesn't seem to be the case.
2 hours ago by ncal
"Every record has been destroyed or falsified, every book rewritten, every picture has been repainted, every statue and street and building has been renamed, every date has been altered. And the process is continuing day by day and minute by minute. History has stopped. Nothing exists except an endless present in which the Party is always right." -George Orwell, 1984
an hour ago by TameAntelope
The problem we actually need to solve is how to divorce the concept of free speech from private companies. For some reason, it's almost ubiquitously believed that the "Internet" is actually just Twitter, Google, and Facebook.
How can we, the tech literate, start to explain to our friends/family and eventually the world, that the Internet is actually an interconnected network of computers that no one person or entity controls?
Honestly, I fear that even among us, we have large groups of uninformed people who believe companies like Google and Facebook are more than just leaf nodes on the graph, and that worries me. If knowledgeable people can be this wrong, how can we expect the uninformed to get this right?
I worry about the future of tech, if the discourse on HN is any indication of how the greater tech community feels. The idea that Google is an integral part of using the Internet to express oneself is not only completely wrong, but actively harmful, and I have no clue what to do about it.
"The Net interprets censorship as damage and routes around it". John Gilmore's statement has seemingly gone out of vogue, but it remains true. Google cannot stop information (good and bad) from spreading, so why do people think they can, and how do we disabuse Americans (and the world) from the notion that Google has that power?
28 minutes ago by xerxesaa
Even if you are tech literate, how do you easily find an alternative to Facebook and Twitter?
The entire benefit of these platforms derives from the fact that they've accumulated so many users. People use these platforms because everyone else they know is on them.
I personally haven't used either for several years but I've had to "sacrifice" my online social presence and miss out on updates from my circle as a result. I put "sacrifice" in quotes because I think the net result is positive and though I miss having some updates, I overall feel quality of life is better without a public online presence.
23 minutes ago by TameAntelope
As you've mentioned, the notion that you need an alternative to live a healthy, happy life is untrue.
How do we explain this to the world?