Even More DiscoMD5 Nonsense
-
We're down 1 GB from before when we were idling and running
-
I suspect there's a hard limit on the cooked post in the database; anything much larger than what I actually posted resulted in server errors and didn't post.
It seems that both raw and cooked are text fields in the database, which a quick Google suggests should hold up to 1 GB in Postgres.
Whether or not the 1038 layers of Discourse can cope with that is a different matter.
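For the curious, the types are easy to confirm from a Rails console (a minimal check, assuming the stock Discourse schema, where posts.raw and posts.cooked are the columns in question):

    # ./launcher enter app, then open a Rails console
    Post.columns_hash["raw"].sql_type      # => "text"
    Post.columns_hash["cooked"].sql_type   # => "text"
    # Postgres `text` has no declared length cap; ~1 GB is the
    # general per-field maximum.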
-
Well, we haven't quite hit it yet... I really want to see if we can hit that limit.
-
Is there a string of fewer than 32,000 characters that contains a hex MD5 of itself?
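For scale: a hex MD5 is 32 hex digits, so any given 32-character window matches the digest with probability about 16^-32 = 2^-128. Brute force is therefore hopeless, but the test itself is trivial (a sketch purely to state the problem precisely, not a serious search):

    require 'digest/md5'

    # Try suffixed bodies until one contains its own hex MD5.
    # Expect on the order of 2^128 iterations, i.e. never.
    n = 0
    loop do
      n += 1
      body = "md5 quine attempt #{n}"
      if body.include?(Digest::MD5.hexdigest(body))
        puts body
        break
      end
    end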
-
In case anyone was wondering, posting this nearly pegged a 4GB DO droplet's CPU.
-
Got it to a body size of ~6.6 MB when nginx started bombing out with "413 OK (Request entity too large)"...
WTF does it do? Send the content of the preview window? Oh god, it does. WHAT THE FUCK IS WRONG WITH YOU?
-
So what does it do with the .cooked data it gets from the client?
If I nuke the preview window content with dev tools and send the post, it displays just the same. Is DickSauce making the client do the "cooking" work and send it to the server just to throw it away and do its own "cooking"? (Sorry if this has been discussed before. In that case, just point me to a thread please.)
-
So what does it do with the .cooked data it gets from the client?
fuck me with a purple dildo if i know.
could be anything....
-
Yes. The client cooks for the preview window, and when the user clicks Reply the client's cooked post is displayed. A few Discoseconds later, the server's cooked post replaces the client's cooked post. 99% of the time they are the same and you wouldn't notice, but every once in a while you could see your post change. The Discodevs claimed that they use the same code to cook on both ends, but after we called them on it, they admitted that the server does more filtering/sanitizing/whatever.
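In rough Rails terms, the flow described above looks like this (a sketch: PrettyText.cook is Discourse's actual server-side cooking entry point, but the controller shape here is simplified and hypothetical):

    # The client POSTs both the raw markdown and its own client-side cook;
    # the server throws the client's cooked HTML away and re-cooks from raw.
    def create
      raw = params[:raw]
      _client_cooked = params[:cooked]   # uploaded by the client, then ignored
      cooked = PrettyText.cook(raw)      # the server's own (stricter) cook
      post = Post.create!(raw: raw, cooked: cooked)
      render json: post                  # this cooked version replaces the preview
    end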
-
Is DickSauce making the client do the "cooking" work and send it to the server just to throw it away and do its own "cooking"?
Yes.
-
when the user clicks Reply the client's cooked post is displayed. A few Discoseconds later, the server's cooked post replaces the client's cooked post.
Still makes no sense to send the cooked post to the server. It's just being displayed client-side for a few moments while the server gets its shit together and responds.
-
Still makes no Discosense to send the cooked post to the server.
And that's why it happens ;)
-
Jesus, this entire thread is special.
And that's special for discourse.
It's Discospecial.
-
Got it to a body size of ~6.6 MB when nginx started bombing out with "413 OK (Request entity too large)"...
WTF does it do? Send the content of the preview window? Oh god, it does. WHAT THE FUCK IS WRONG WITH YOU?
If I nuke the preview window content with dev tools and send the post, it displays just the same.
Please tell me you tried posting larger ones by nulling out the preview window content...
-
Uh... client_max_body_size is set to 10m.
-
In case anyone was wondering, posting this nearly pegged a 4GB DO droplet's CPU.
Have we calculated the ratio of posted text : cooked text? 48 bytes becomes 4GB?
-
Okay, so here's where it gets REALLY, REALLY DISCOSPECIAL.
nginx: client_max_body_size is set to 10m (10 MB).
FUCKING DISCOURSE has a SEPARATE SETTING of max image size kb (and max attachment size kb) that MUST MATCH nginx's setting or else it won't work. EXCEPT DISCOURSE GOES TO NO EFFORT TO KEEP THEM THE SAME. YOU HAVE TO MANUALLY SET THE DISCOSETTINGS and THEN run ./launcher enter app and modify the nginx config file by hand. But by default they don't even match anyways!??!, so DiscoHorse just chooses its (smaller) default of 3 MB and lets nginx think it's okay to accept 10 MB bodies. I guess when that happens you get a 500 Internal Server Error.
What in the cocksucking fuck?
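To spell out the mismatch (values as reported above; the config path inside the container is the usual Discourse Docker layout, so treat it as an assumption):

    # nginx, inside the container (./launcher enter app):
    # /etc/nginx/conf.d/discourse.conf
    client_max_body_size 10m;    # nginx accepts bodies up to 10 MB

    # Discourse site settings (defaults):
    # max image size kb      = 3072    # i.e. 3 MB
    # max attachment size kb = 3072

Anything between 3 MB and 10 MB sails past nginx and then dies inside Discourse, hence (presumably) the 500 instead of a clean 413.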
-
a 7.1 * 10^144 character post should be possible
Ouch. That's going to break in so many different ways.
-
It accepts them up to 10MB and then resizes them to the maximum allowed.
It has 5 attempts at resizing the image down to the allowed size too.
-
Please tell me you tried posting larger ones by nulling out the preview window content...
Tried, but the massive preview content makes dev tools practically unusable. I'll probably resort to hand-crafting the request.
NukeBot, anyone?
-
It has 5 attempts at resizing the image down to the allowed size too.
Is THAT what takes forever at 100%?!
What happens if it fails?
-
Does the increased max_body do anything for you?
-
Is THAT what takes forever at 100%?!
Presumably.
What happens if it fails?
If it hasn't made it small enough in 5 attempts, you get the "image is too large" error.
-
Can a few people join up really quickly? I want to see something with users....
-
Not really, seems like my browser is the bottleneck now.
-
Well, that's kinda close... have a like for that.
-
It has 5 attempts at resizing the image down to the allowed size too.
......
just resize the fucker down to 500px wide (or whatever the site setting is) and be done with it. if it's still too big then throw an error.
-
Or just reject the file for being too large.
-
It has 5 attempts at resizing the image down to the allowed size too.
Am I reading that code correctly? They literally attempt the exact same resize on the exact same image five times because if it discofails the first four times it might discosucceed on the fifth. What is this? Faith-driven development?
Also magic numbers == urgh, but whatevs, this is Discourse.
-
They resize it to 80% of its original size, then if that's still too large, resize the resized image to 80%... and so on.
-
They literally attempt the exact same resize on the exact same image five times because if it discofails the first four times it might discosucceed on the fifth.
i think that downsize function overwrites the temp file with the resized file?
-
Am I reading that code correctly? They literally attempt the exact same resize on the exact same image five times because if it discofails the first four times it might discosucceed on the fifth. What is this? Faith-driven development?
Not quite. It's downsizing the file in place, so the next time it tries to downsize it's starting with the file it already downsized.
Presumably the function will downsize the file further each time.
They resize it to 80% of its original size, then if that's still too large, resize the resized image to 80%... and so on.
Is that a size parameter? I expected it to be the JPEG compression level.
Still, passing a percent as a string? WTF.
i think that downsize function overwrites the temp file with the resized file?
It has to. That's the file it's checking each time through the loop to see if it's small enough yet.
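Pieced together from the above, the loop is presumably something like this (a sketch, not verbatim Discourse source; max_image_size_kb is inferred from the max image size kb setting discussed earlier):

    # Downsize the temp file in place to 80% of its current dimensions,
    # re-check the size, give up after 5 attempts. Because each pass
    # overwrites the file, the scaling compounds: 80%, 64%, 51%...
    5.times do
      break if File.size(tempfile.path) <= SiteSetting.max_image_size_kb.kilobytes
      OptimizedImage.downsize(tempfile.path, tempfile.path, "80%",
                              allow_animation: SiteSetting.allow_animated_thumbnails)
    end
    # still too big after 5 passes -> the "image is too large" error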
-
They resize it to 80% of its original size, then if that's still too large, resize the resized image to 80%... and so on.
Because resizing it once to the right size makes sense?
-
Not quite. It's downsizing the file in place, so the next time it tries to downsize it's starting with the file it already downsized.
Wow, well that's not at all obvious from looking at the code, and also I'm not sure that it's any better than what I suggested.
...and here comes the double-post...
EDIT: how can the body be too similar if my last 500 was an actual 500, then? Fuck this software.
-
Because resizing it once to the right size make sense?
If you're trying to reach a target file size, you really can't just resize it once to the right size. You don't know exactly how well it'll compress (and therefore, what the resulting image's file size will be) without trying it.
-
True, I guess. But still, it should always resize from the original, to preserve as much quality as possible.
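For contrast, the suggested resize-from-the-original approach might look like this (hypothetical code, same assumed names as the sketch above; it steps through the same geometric sizes but always resamples the pristine original, so the output is re-encoded only once):

    scale = 100
    5.times do
      scale = (scale * 0.8).round        # 80, 64, 51, 41, 33...
      OptimizedImage.downsize(original.path, resized.path, "#{scale}%",
                              allow_animation: SiteSetting.allow_animated_thumbnails)
      break if File.size(resized.path) <= SiteSetting.max_image_size_kb.kilobytes
    end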
-
I'd be surprised if there was a noticeable quality difference either way. I don't care enough to find out though.
-
Wow, well that's not at all obvious from looking at the code
It's passing the same filename twice; what else would it be doing?
    OptimizedImage.downsize(tempfile.path, tempfile.path, "80%", allow_animation: SiteSetting.allow_animated_thumbnails)
-
it should always resize from the original, to preserve as much quality as possible.
This is Discourse. Quality is irrelevant.
-
It's passing the same filename twice; what else would it be doing?
Oh I see, so not only is it resizing images in a way that's retarded, they also picked probably the slowest and most filesystem-intensive method to do it.
-
They resize it to 80% of its original size, then if that's still too large, resize the resized image to 80%... and so on.
cinqtuple the compression artifacts!
-
Shit. I just realized two things about this MD5hit nonsense:
-
Is this going to fuck up the migration? Does this mean the migration off this fuckswing will have to do MD5 parsing? If so, does that mean the migration will need to be smrt enough to only do one level deep so it doesn't explode?
-
Does this permanently fuck up "Download My Posts" for anyone who posted a server-crashing post? (Or anyone who replied to it?) Or does DiscoDownload differentbake those files?
-
@Lorne_Kates said:
Does this permanently fuck up "Download My Posts" for anyone who posted a server-crashing post? (Or anyone who replied to it?) Or does DiscoDownload differentbake those files?
Just did this on discourse.element.ws (and I just resized the server back to 1GB, so it definitely chokes on those posts now). It just dumps the raw into them. My export was like 50 MB. Only took a few seconds though.
-
@Lorne_Kates said:
Does this permanently fuck up "Download My Posts" for anyone who posted a server-crashing post? (Or anyone who replied to it?) Or does DiscoDownload differentbake those files?
Just did this on discourse.element.ws (and I just resized the server back to 1GB, so it definitely chokes on those posts now). It just dumps the raw into them. My export was like 50 MB. Only took a few seconds though.
I don't know if that deserves credit for not breaking the download or scorn for serving raw, unbaked posts so people will have to bake them themselves to see what they actually posted.