This is the DUMBEST idea I've seen in a long time
-
Racking Mac Pros. Not the old rectangular model, the stupid dustbin ones.
To do image rendering.
Because:
Parts of our technology are built using OS X’s graphics frameworks, which offer high quality output and excellent performance.
And obviously creating expensive custom rack-mount hardware was a lot more difficult than, uh, just porting your fucking code to raw OpenGL and running it on Windows or Linux servers at 1/3rd the cost?
What. The. Fuck.
-
Not the first time something like this was done...
http://www.mk1manufacturing.com/mk1manufacturing.com/SANLink_files/shapeimage_2.png
Although your find is better.
-
That's so incredibly stupid.
Truly a wtf. +1
-
>Parts of our technology are built using OS X’s graphics frameworks, which offer high quality output and excellent performance.
Putting aside the OS X stuff... people are aware that OS X is built on top of *nix these days, so unless they're idiotic enough to link against Cocoa or something, they can run the render code pretty much unaltered on most flavors of *nix...
Right?
I mean, this stuff isn't exactly a secret.
-
You're not understanding the mindset of a mac fan.
To them, OS X IS Unix. Unix only came to exist as part of a divine plan to realize OS X.
-
I would love to see what Apple's doing (outside of the normal OpenGL library) that makes this worthwhile. I'm also having trouble imagining what would make one OpenGL produce "better quality" than another. It's the same fucking API running on the same fucking GPU.
I thought OS X just shipped OpenGL, and a relatively old version at that, and shoved some thin APIs on top of it.
EDIT: in the HackerNews comments, the guy who created this monstrosity admitted that switching to Linux/Windows servers would save 10-15%. I highly doubt the savings would be that low, but even if they were: 15% savings is WELL-WORTH a software port.
-
just porting your fucking code to raw OpenGL and running it on Windows or Linux servers at 1/3rd the cost?
And I somehow managed to miss this part of the OP, but quoted for belgiuming truth.
-
You're not understanding the mindset of a mac fan.
you're not one of them are you? :frystare.qt:
I would love to see what Apple's doing (outside of the normal OpenGL library) that makes this worthwhile.
Nothing, because Cocoa is just this:
I thought OS X just shipped OpenGL, and a relatively old version at that, and shoved some thin APIs on top of it.
-
EDIT: in the HackerNews comments, the guy who created this monstrosity admitted that switching to Linux/Windows servers would save 10-15%. I highly doubt the savings would be that low, but even if they were: 15% savings is WELL-WORTH a software port.
These new garbage can Macs have some crazy dual graphics cards, super-optimized for the likes of Pixar animators and 3D folk in general.
That said, 15% does sound like he's lowballing the difference. I think the guy is just a Mactard who wanted to play with a bunch of shiny Apple toys and convinced himself (and investors / superiors) that this makes sense financially.
-
I'm also having trouble imagining what would make one OpenGL produce "better quality" than another.
Kind of like how playing the exact same audio file off two different brands of hard drive can sound different.
(I know this has been posted before, credit to whoever posted it first)
-
You know, if he priced this shit out back when the moron Bitcoin speculators were driving the price of server-quality GPUs waaay up above normal, maybe his numbers almost make sense.
Still, tying their company to Apple is such a stupid idea; they'll need to port sooner or later, and the sooner they start, the better off they'll be.
-
If I had the money, I'd run a study proving solar weather has more of an effect on audio than these stupid things they make a big deal over. But I don't, so I'll just pretend I did...
-
Racking Mac Pros. Not the old rectangular model, the stupid dustbin ones.
Huh. I was just told that we do that here. For the mac build system.
(the only non-mac people here are us Windows devs)
-
For the mac build system.
The difference is that this is an application that logic can arrive at. The OP isn't.
-
I think the guy is just a Mactard who wanted to play with a bunch of shiny Apple toys and convinced himself (and investors / superiors) that this makes sense financially.
This.
-
just porting your fucking code to raw OpenGL and running it on Windows or Linux servers at 1/3rd the cost?
But that's not nearly cool enough.
There's a blogger I read, who has done a couple of professional games, and writes toy ones in his spare time, who just a couple of weeks ago quit using Macs and switched to Windows: "Enemies not showing up was simply due to an uninitialized alpha value in their shader – yet another thing that was just getting waved through on the Mac despite being completely busted. Man, I am not sorry to see the back of that platform. It’s amazing how dramatically my attitude towards Apple has changed in just a few weeks. As the line goes, I suddenly realized that I hate it, and I have hated it for a very long time."
The post where he said it was http://mayflystudio.tumblr.com/post/117653273827/alert-readers-may-notice-something-a-little "The quality of Apple’s software, both desktop and mobile, has been in a tailspin ever since Steve left us, and their developer support goes way past dreadful and well into abusive. I just didn’t want to accept it, so I put up with worse and worse OS revisions until I looked around and realized whoa, this water is getting really hot. Time to jump. And in all honesty? Windows is a relief. I feel like I just took off a straitjacket I didn’t know I was wearing.
Also, Visual Studio 2013 is enormously better than XCode."
-
Also, Visual Studio 2005 is enormously better than XCode.
FTFY (and 2013 is miles ahead of 2005)
-
That reminds me of a graphics artist person I know who spent $3100+ [1] on a retina iMac. Among other things, it has 32GB of RAM.
Their excuse? "Oh, well I inherited the money when my grandfather passed away."
I'm going to note that they do 2D art and not 3D rendering. 'cause Photoshop really needs 32GB of RAM!!!!!!11111
[1] $3100 is just what I know about from checking the price on the online Apple store after seeing the specs for the screen size and RAM. It could be higher if they chose any other custom options.
-
The quality of Apple software has been in decline for a lot longer than just since Jobs' death.
Their developer support has always been shit, especially for game developers.
But yes, I agree with him on all fronts. Except I haven't used XCode so I can't make a fair assessment. Is Apple's new proprietary language Swift a match for C#?
-
Is Apple's new proprietary language Swift a match for C#?
No idea. I don't think I've used a mac since like 2001.
-
It hasn't shown up on this site yet so that's a ... good ... sign?
-
It's better, because you can use emoji in the code!
Yes, really.
-
Not the first time something like this was done...
Those made sense though, if you were in an environment that needed a Mac to manage iOS devices. I think that issue has been solved now, but at one point it made sense. Buying overpriced dustbins does not.
-
Well, C# supports utf-16, so you can do all kinds of horrible things like that. I remember seeing some C# code once where the method names were in katakana.
-
Oh, I imagine you probably can do it.
The point is, someone actually wrote an article about the fact you can do that in Swift. Like it's something to get excited about.
-
Racking Mac Pros. Not the old rectangular model, the stupid dustbin ones.
What they are trying to do is basically this:
-
Well, C# supports utf-16, so you can do all kinds of horrible things like that.
The compiler rejects variable names with emoji; I imagine it'd do the same with class and method names too.
-
[Export]
public class プレイヤヴィウーモデル : NotificationObject

Compiles just fine!
-
The compiler rejects variable names with EMOJI; I imagine it'd do the same with class and method names too.
-
@Magus said:
Well, C# supports utf-16, so you can do all kinds of horrible things like that.
The compiler rejects variable names with emoji; I imagine it'd do the same with class and method names too
UTF-16 and emoji don't play well together. C# and Windows have always had issues with surrogate pairs.
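For anyone curious, the surrogate-pair thing is easy to demonstrate. A quick sketch in Python (used here just as a neutral way to poke at the encoding; the C# behavior being discussed is analogous, since .NET strings are sequences of UTF-16 code units):

```python
# Code points above U+FFFF don't fit in one 16-bit code unit, so
# UTF-16 encodes them as a surrogate pair (two code units).

def utf16_units(ch: str) -> int:
    """Number of UTF-16 code units needed to encode one character."""
    return len(ch.encode("utf-16-le")) // 2

print(utf16_units("\u263a"))      # U+263A WHITE SMILING FACE -> 1 (BMP)
print(utf16_units("\U0001F600"))  # U+1F600 GRINNING FACE -> 2 (surrogate pair)
```

This is why a single non-BMP emoji reports Length == 2 in a .NET string, which is where most of the "issues" with naive code-unit indexing come from.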
-
That's good at least.
-
UTF-16 and emoji don't play well together. C# and Windows have always had issues with surrogate pairs.
I used an emoji from this set; no surrogate pairs involved
-
VB.NET at least has some name restrictions - you can't create a variable named ×, for example.
-
This is the DUMBEST idea I've seen in a long time
After looking over their site, they have a track record of idiotic ideas. This is their alternative to racking thousands of Mac Minis. So...maybe it is a little better? At least now they are racking some substantial hardware? A Mac Mini is basically just laptop hardware in a different form factor.
But yeah, still pretty fucking stupid. This is what happens when you give fanboys VC funding.
-
I would love to see what Apple's doing (outside of the normal OpenGL library) that makes this worthwhile. I'm also having trouble imagining what would make one OpenGL produce "better quality" than another. It's the same fucking API running on the same fucking GPU.
I suspect this is a marketing gimmick to his customers, who probably also buy Monster Cables for their stereo system.
-
Technically it might be a bad idea, but I don't think it's cost-ineffective. Say the same setup was 30% cheaper with an HP server + Linux. That amounts to what? US$4000? Hardware is cheap.
Porting whatever they're doing so it works on Linux, changing their work tools, fixing driver issues, etc. might cost more in man-hours than this.
Also, this way they get some free publicity. Beowulf cluster? Meh!
-
Rendering on a Mac?
That's particularly ironic considering this little tidbit:
Roosendaal recently reported that Jens Verwiebe, Blender’s long-time OS X platform maintainer, had decided to “abandon OS X as a serious 3D/graphics development platform” citing “lack of quality GPU support”.
-
Technically it might be a bad idea, but I don't think it's not cost effective. Say the same setup was 30% lower with HP server + Linux. That amounts to what? US$4000? Hardware is cheap.
More like $4000 per box.
-
Come on, those Macs aren't that expensive.
Dafuq! One of those Macs costs close to 9000€.
-
Come on, those Macs aren't that expensive.
They're having some company CUSTOM-FABRICATE each one of those metal shelves for them, I bet that's $4000/each alone. (Of course there's only one per 4 servers but... shut up.)
-
In dollars, that must be over the stated number.
-
More like $4000 per box.
That was my thought too. Plus, you're not likely (I assume) to build just one rack of those.
-
Technically it might be a bad idea, but I don't think it's not cost effective. Say the same setup was 30% lower with HP server + Linux. That amounts to what? US$4000? Hardware is cheap.
Porting whatever they're doing so it works on Linux, changing their work tools, fixing driver issues, etc. might be more expensive on man hours than this.
Per 4U of rack space, and they have their own datacenter which you can see in the photos. So, we are talking about many multiples of the number, which will be way over $4K. They are spending at least $6K per Mac Pro and presumably fitting 4 of them per 4U of rack space. So, 1 Mac Pro per unit of rack space. Plus the cost of the enclosure.
Backblaze spends ~$600 per enclosure for their custom 4U cases (at least that is the last I heard). So each 4U is costing them ~$25,500. ~$6,375 per U.
You can buy a hell of a 1U server for $6,375.
Edit: You can buy a hell of a 1U server with dual power supplies, out-of-band management, and complete upgradeability. Also, it will not look like a fancy trash can, or an "air purifier" from The Sharper Image.
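Spelling out the back-of-envelope math, using only the rough round numbers from the post above (hypothetical estimates, not real quotes):

```python
# Cost floor per 4U of rack space, per the estimates above.
mac_pro = 6_000      # "at least $6K per Mac Pro"
enclosure = 600      # Backblaze-style custom 4U case, ~$600
macs_per_4u = 4

floor_per_4u = macs_per_4u * mac_pro + enclosure
print(floor_per_4u)        # 24600 -- absolute floor per 4U
print(floor_per_4u / 4)    # 6150.0 per U at that floor

# The post's slightly higher estimate:
print(25_500 / 4)          # 6375.0 per U
```

The gap between the $24,600 floor and the ~$25,500 estimate just reflects "at least $6K" being a floor, not an actual price.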
-
They're having some company CUSTOM-FABRICATE each one of those metal shelves for them, I bet that's $4000/each alone.
Maybe if they were doing one-offs, but in scale that number comes down massively.
Backblaze spends less than that, as they order a lot of them and they originated the design.
If they were only buying one or two, you would be right. They are outfitting a datacenter.
-
You can buy a hell of a 1U server for $6,375.
But will it have the graphics capability of the trash can Mac?
-
Does anyone know exactly which parts of the "OS X graphics framework" they are using?
At first I thought it was OpenGL, but then porting would be trivial, and nobody uses OpenGL to get "high quality output".
So to me it sounds like they are just using the standard Mac OS X graphics framework to render graphics to images, but that seems too dumb, even for The Daily WTF.
-
It hasn't shown up on this site yet so that's a ... good ... sign?
We discussed it last June:
-
-
If I had the money, I'd run a study proving solar weather has more of an effect on audio than these stupid things they make a big deal over. But I don't, so I'll just pretend I did...
Sounds like a Kickstarter...