Mozilla intends to deprecate HTTP-without-TLS
-
I think the title speaks for itself, really.
-
So, the WTF is Mozilla wants the Web to be more secure? Because that sounds like a good thing to me...
-
So, the WTF is Mozilla wants the Web to be more secure? Because that sounds like a good thing to me...
HTTPS has overhead, both for the end-user and the implementers, you need a signed certificate, and for a huge majority of the sites you don't need HTTPS because you're not transmitting any sensitive data.
Yep, it is as stupid as it sounds.
-
In order to encourage web developers to move from HTTP to HTTPS, I would
like to propose establishing a deprecation plan for HTTP without security.
Broadly speaking, this plan would entail limiting new features to secure
contexts, followed by gradually removing legacy features from insecure
contexts. Having an overall program for HTTP deprecation makes a clear
statement to the web community that the time for plaintext is over -- it
tells the world that the new web uses HTTPS, so if you want to use new
things, you need to provide security.
Ok, fine, agreed. Now, what about all those old sites floating around? Bits of history written in HTML 1.0 on some poor old Solaris box in a corner of a university somewhere? Will we be able to access those? Are we sure someone will move that content and keep it accessible?
There are many technical considerations here for not allowing some stuff without HTTPS. But outright deprecating and killing off HTTP? Just my simple, stupid example would probably mean we'll lose a part of the web. Hell, we're probably losing bits of it that are in the same situation daily anyway, do we really need to help it along?
Ok, yes, scream bloody murder at me when I try using fancy new technology X without HTTPS on. But leave the old stuff alone.
-
Ok, yes, scream bloody murder at me when I try using fancy new technology X without HTTPS on. But leave the old stuff alone.
Paging @apapadimoulis ...
-
So, the WTF is Mozilla wants the Web to be more secure? Because that sounds like a good thing to me...
Remember the old adage about how to make a system more secure: Disconnected from the network, powered off, and locked in a concrete box, with the key inside.
Yes, it would be "more secure". But does it outweigh the costs?
Fun fact: my home router's configuration page is at http://192.168.1.254/. There is no way the manufacturer is going to get a certificate for 192.168.1.254.
-
There would probably still be backwards compatibility. I mean fuck, if we're allowed to just deprecate obsolete Web crap, then HTTP/HTTPS is the least of our problems.
-
I am unsure of that. You'd think so, but the tone of it reads to me as "we should get rid of this shit completely". I may be wrong, just my impression.
-
There is no way the manufacturer is going to get a certificate for 192.168.1.254.
A "solution" from the discussion:
Perhaps a build-time configuration could be enabled that would enable system administrators to ignore the warning for certain subdomains or the RFC 1918 addresses as well as localhost. Note that carrier grade NAT in IPv4 might make the latter a bad choice by default.
Because compiling Firefox is fun™!
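The RFC 1918 carve-out itself is at least mechanical to implement. A minimal sketch of the check such a whitelist would need, using Python's stdlib `ipaddress` module (the range list is straight from RFC 1918; the function name is just for illustration):

```python
import ipaddress

# The three IPv4 blocks RFC 1918 reserves for private networks.
RFC1918_NETS = [
    ipaddress.ip_network("10.0.0.0/8"),
    ipaddress.ip_network("172.16.0.0/12"),
    ipaddress.ip_network("192.168.0.0/16"),
]

def is_rfc1918(addr: str) -> bool:
    """True if addr is an IPv4 address inside a private (RFC 1918) range."""
    ip = ipaddress.ip_address(addr)
    return ip.version == 4 and any(ip in net for net in RFC1918_NETS)
```

So `is_rfc1918("192.168.1.254")` covers the router-config-page case above; note it deliberately says nothing about localhost or carrier-grade NAT ranges, which is where the "bad choice by default" caveat comes in.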
-
Mozilla intends to deprecate HTTP-without-TLS
That's more like they're planning to make themselves completely obsolete in the eyes of users. Security people are rather good at that sort of thing. Maybe they should also prevent connections from being established unless the user has a blanket over their head and their special safety teddy bear.
-
Ooooh! Hardware token teddy bears!
-
< insert dongle joke here >
-
That's more like they're planning to make themselves completely obsolete in the eyes of users.
I would be very surprised if Google did not have a similar plan. And then users have very little choice, unless we're about to get another wave of Microsoft dominance? (Or maybe Apple this time, just to mix things up)
-
< insert dongle joke here >
I just wanted to make asymmetric cryptography cute and cuddly!
Then the Internet happened
-
HTTPS has overhead, both for the end-user and the implementers, you need a signed certificate, and for a huge majority of the sites you don't need HTTPS because you're not transmitting any sensitive data.
Meh! With today's hardware and network connections, this is not an issue. And you can create certificates; the user will just have to accept them, but at least the connection will be secured. Also, who are you to define what is sensitive and what's not? Get a bunch of non-sensitive data from someone and you end up knowing a lot about that person. BTW, usually a physical address is not treated as sensitive data, but damn if it's not for me.
Ok, fine, agreed. Now, what about all those old sites floating around? Bits of history written in HTML 1.0 on some poor old Solaris box in corner of a university somewhere? Will we be able to access those? Are we sure someone will move that content and keep it accessible?
Do you know the meaning of the word "deprecate"? Here:
1
a archaic : to pray against (as an evil)
b : to seek to avert <deprecate the wrath … of the Roman people — Tobias Smollett>
2
: to express disapproval of
3
a : play down : make little of <speaks five languages … but deprecates this facility — Time>
b : belittle, disparage
Anyway, I was thinking about this yesterday and maybe the solution would be to present web users with something like what they're shown when installing an Android/iOS app:
This web site uses cookies from 3rd parties (x.com, y.com, etc)
This web site uses a secure connection (HTTPS)
This web site wants to know your location
FirefoxOS already does something like this with a manifest file.
-
Remember the grumpy cat research thing from Lorne Kates' signature?
Few users, if any, care about such security things - most people just get annoyed because they see words!
-
Yeah, it's a fine line. But it would make our (devs, ops, IT) lives easier. For example, right now I have to tell users how to allow certain sites' cookies so IE doesn't bitch about it.
-
Meh! With today's hardware and network connections, this is not an issue.
It is an issue when there's zero gain from it.
And you can create certificates, only the user will have to accept them.
And have the browser yell at you in big red letters because the cert is self-signed. No, thanks.
Also, who are you to define what is sensitive and whats not? Get a bunch of non-sensitive data from someone and you end up knowing a lot about that person.
If you're really paranoid enough to care whether someone snoops on your morning news reading, just make an encrypted tunnel. Most people don't unless it's personal info, and those forms tend to be encrypted.
-
Do you know the meaning of the word "deprecate"?
Yes. I also know that in the context of computing it's used to indicate a grace period just before something is completely removed / changed.
It's usually accompanied by
W_SPANK_SPANK_BAD_PROGRAMMER
being thrown at you, just before
E_NO_WAI_COMPLAIN
hits you over the head in the next version.
-
the title speaks for itself
Yeah, but it would have spoken a lot louder without ‘-without-TLS’ added on like that.
There would probably still be backwards compatibility. I mean fuck, if we're allowed to just deprecate obsolete Web crap, then HTTP/HTTPS is the least of our problems.
I missed that bit at first too. Fucking sneaky bastard, all smooth-talking ‘the new web uses HTTPS’, then dropping that
>removing legacy features from insecure contexts
in like it's no big deal.
Anyway, no surprises that the US government is pushing Encryption Everywhere; that just means they've got all their backdoors in already, and they're trying to push home the advantage over all the other losers who won't be able to intercept everything.
-
Anyway, no surprises that the US government is pushing Encryption Everywhere; that just means they've got all their backdoors in already, and they're trying to push home the advantage over all the other losers who won't be able to intercept everything.
It could also mean they're made up of multiple organizations and groups with different goals.
But what does the US government have to do with anything? Are they pushing Mozilla to deprecate HTTP? If so, do you have a source on that?
-
It could also mean they're made up of multiple organizations and groups with different goals.
When are sheeple like you going to wake up to the real truth?
But what does the US government have to do with anything? Are they pushing Mozilla to deprecate HTTP? If so, do you have a source on that?
Third line of the thing you linked.
-
HTTPS has overhead, both for the end-user and the implementers
My personal results:
HTTPS over SPDY (where HTTP 2.0 offers similar protocol benefits): ~0.9s
Plain HTTP 1.1: ~7.4s
So about that overhead...
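For what it's worth, that kind of number is easy to reproduce yourself; a rough sketch of the measurement, assuming you point it at URLs of your choosing (best-of-N to smooth out network jitter):

```python
import time
import urllib.request

def time_fetch(url: str, tries: int = 3) -> float:
    """Best-of-N wall-clock seconds to fetch url and read the full body."""
    best = float("inf")
    for _ in range(tries):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()  # drain the body so transfer time is included
        best = min(best, time.perf_counter() - start)
    return best

# e.g. compare time_fetch("https://example.com") vs time_fetch("http://example.com")
```

Caveat: a demo like this conflates TLS handshake cost with everything else going on in the page load, so a SPDY-vs-HTTP/1.1 comparison mostly measures multiplexing, not encryption.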
-
HTTPS has overhead, both for the end-user and the implementers, you need a signed certificate, and for a huge majority of the sites you don't need HTTPS because you're not transmitting any sensitive data.
Yep, it is as stupid as it sounds.
Is it? Certs are pretty cheap, the overhead is pretty low compared to the amount of data being sent on modern websites, and a lot of people all too freely send personal details over the Web without thinking about the security implications. So I don't really see anything wrong with encouraging better security all-round.
OK, no-one's going to want to pay for a cert for 192.168.x.x, but that IP range (and other private network ranges) can be assumed safe. And if they prove not to be... well, TLS isn't going to help anyway
-
OK, no-one's going to want to pay for a cert for 192.168.x.x, but that IP range (and other private network ranges) can be assumed safe.
Tell that to modern browsers. As in, set up a secure WebSocket server on a private network range using a self-signed certificate and connect to it from a browser using JS WebSocket API without manually importing the certificate. I'll wait.
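Outside the browser you at least get an escape hatch. A sketch of the client-side knob in Python's `ssl` module that the JS WebSocket API simply doesn't expose (the function name is mine; LAN testing only, since this turns off MITM protection entirely):

```python
import ssl

def lan_test_context() -> ssl.SSLContext:
    """A client TLS context that accepts self-signed certs.

    This is the per-connection override browsers refuse to offer for
    wss:// -- use it only against hosts you control on a trusted LAN.
    """
    ctx = ssl.create_default_context()
    ctx.check_hostname = False       # don't match the cert's hostname
    ctx.verify_mode = ssl.CERT_NONE  # don't verify the chain at all
    return ctx
```

Note the order matters: `check_hostname` must be disabled before setting `verify_mode` to `CERT_NONE`, or the stdlib raises a `ValueError`.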
-
One would hope that such support would be added to the browser
-
Or at least provide a pop-up. But nope. Unless something changed in the last few months, it just fails seemingly silently (ok, it does spew errors into the console).
I remember one of the browsers offering me the cert when I tried to access the wss:// URL directly, so that's something?
-
So about that overhead...
What did you prove? Yes, you can mitigate the overhead by using more efficient protocols. That doesn't mean there's no overhead - if you applied the same protocols to non-encrypted traffic, the result would be even faster.
Certs are pretty cheap, the overhead is pretty low compared to the amount of data being sent on modern websites, and a lot of people all too freely send personal details over the Web without thinking about the security implications.
So let's break half the fucking Internet so that poor Joe Shmuck doesn't need to pay attention to the padlock anymore. The overhead-to-benefit ratio is infinite whenever you don't send anything worth the hassle, which is 99% of the time.
So I don't really see anything wrong with encouraging better security all-round.
Backwards compatibility? Fuck that, it's 2015, the year of the New Web!
Overhead and generally wasting gajillions of cycles and bandwidth by sending assets via encrypted connections? Fuck that, it's 2015, we're rich and don't give a shit!
Having to get a cert for every single website you own, no matter how small or how local? Fuck that, the web is obviously just Tumblr and Facebook and Slashdot, and nobody needs to care about more localized solutions!
-
You reply as if this is all going to happen overnight; if it does happen, it'll take at least a decade to even gain momentum, let alone be completed. Plus deprecated != removed.
-
http://www.httpvshttps.com/
My personal results:
HTTPS over SPDY (where HTTP 2.0 offers similar protocol benefits): ~0.9s
Plain HTTP 1.1: ~7.4s
So about that overhead...
My internet speed is currently capped, and I got 77.450s on HTTP and 110.736s on HTTPS, on the latest Chrome.
So, what about that overhead?
-
You reply as if this is all going to happen overnight; if it does happen, it'll take at least a decade to even gain momentum, let alone be completed.
So? You think that means that in a decade there will actually be no HTTP websites?
Plus deprecated != removed.
Then why deprecate it if it needs to be supported anyway? People already use HTTPS where applicable, but now they have a choice of not using it where it's not applicable. And frankly, I haven't seen an unprotected login form in quite a while, with a notable exception of this site.
-
You think that means that in a decade there will actually be no HTTP websites?
I believe I said a decade to 'gain momentum'... yup, that's precisely what I said. So if you're going to challenge my point, challenge my point, not the words you're putting in my mouth.
-
I believe I said a decade to 'gain momentum'... yup, that's precisely what I said.
Sooo... we'll have more people saying to abolish HTTP?
Look, no matter how much momentum it gains, there will still be lots and lots of legacy on the web. Even if it was a sensible breaking change. Which it isn't.
-
Sooo... we'll have more people saying to abolish insecure HTTP?
Why not? I still don't see an issue with making the Web more secure by default.
-
Why not?
Because it's not achievable anyway, unless you hunt down everyone who has ever put a site on the Internet?
-
I'll cede that point, with the caveat that it was once believed impossible to land humans on the Moon...
-
Well the point is, we haven't fixed up HTML for years now, and for good reason. What makes you think that converting everyone by force to HTTPS will be any easier than converting everyone by force to well-formed HTML5?
We keep the legacy not because we like it, we keep it because not having it breaks stuff.
-
What makes you think that converting everyone by force to HTTPS will be any easier than converting everyone by force to well-formed HTML5?
Never said it would be
-
I'm actually for this, but you have to simultaneously enable more accessible SSL certs. It's unreasonable to deprecate HTTP and then continue charging $80 for a cert.
-
I would be all for a warning type system, similar to what happens with bad certificates right now...
I'm actually for this, but you have to simultaneously enable more accessible SSL certs. It's unreasonable to deprecate HTTP and then continue charging $80 for a cert.
http://otherurlthatIknowexistsbutdon'tremember
I agree that it could be more accessible though - I think that's the point of the letsencrypt project
-
Also @OP
[quote="Mozilla"]
Broadly speaking, this plan would entail limiting new features to secure contexts, followed by gradually removing legacy features from insecure contexts.
[/quote]
They're not getting rid of HTTP, they're just deprecating it. These sites would ideally be legacy sites anyway, so new features don't matter.
-
Just means yet fewer people will use Firefox. If they go through with it.
-
My point is HTTP is free. It is unreasonable to charge for SSL certs if you only allow HTTPS.
-
StartSSL is free too. As is letsencrypt, once it opens.
-
https://letsencrypt.org/
/me bookmarks this to check on in "mid-2015"
i'd love to get HTTPS active on http://servercooties.com but i don't want to pay for the cert and startssl's interface is.... probably the worst UX i have ever encountered.
-
Yeah, that's true. I'm used to it though. For what they give you (I've got a class 2 account so I've got code signing and wildcard SSL) for the price they give you (60 bucks for all that), I can get past a bad UI.
-
i'd love to get HTTPS active on http://servercooties.com
As if CORS wasn't enough, you want us to deal with the mixed content nightmare as well?
WANT_NOT_FOUND
Now, if WTDWTD moved to HTTPS, I'd be right there with you.
-
letsencrypt
Doesn't work for non-Apache, non-NGINX services (it can theoretically work with IIS, but no code is provided and there are no plans to do so) and has no provision for coordination between multiple sets of service software, either for load balancing (two Apaches on two servers) or variety (HTTPS + FTPS + XMPP, for example).
Otherwise good, and I hope we eventually have a similar solution that actually works.