Anybody know where I can find a breakdown of computer statistics?
-
Namely, I'm very interested in the number of people using 32 bit machines vs 64 bit machines, preferably by country.
I strongly suspect 2/3 of the computing world is still regularly running 32-bit computers (or at least, places like China and Spain are).
Also, if anybody knows whether Microsoft or the other big brands are planning to fully convert to 64-bit machines in the future, that would be appreciated.
My google fu is weak today.
-
19.18% of respondents are on 32-bit operating systems.
-
When you said "breakdown of computer statistics," I immediately thought of forum discussions about the MHP.
-
Steam isn't a terrible sample, but it's not completely representative - it's representative of 1) people that use Steam, and 2) people that have submitted their information.
But 64-bit machines have been around for a while - my last frankenbox had a motherboard from 2007 in it, and that was 64-bit.
-
And even then, the OP's question wasn't exactly worded well. He asked:
Namely, I'm very interested in the number of people using 32 bit machines vs 64 bit machines
The thing is, most (if not all) desktops and laptops made in the past 5+ years have 64-bit capable hardware. The ones sold as 32-bit just have a 32-bit OS installed on top of the 64-bit hardware. The OP should be asking:
What percentage of people are using 32-bit OSs vs 64-bit OSs?
If you want to expand beyond desktops and laptops, you'll have to find someone else with more information on the subject.
-
That is a better qualifier. Specifically, I'm looking for statistics on how many desktops/laptops can run a build targeting x64 from C# in Visual Studio. (Very specific, but you know.)
The reason I'm limiting it to desktops/laptops is that's the platform I'm interested in targeting, since mobile and related have such a wide diversity of 32-bit, 64-bit, ARMv6, ARMv7, etc.
I know it's been around a long time, and I've had a 64-bit operating system for the last 7 years or so, but I'm usually ahead of the curve fairly significantly on new toys, and was wondering about the rest of the world.
-
The thing is, most (if not all) desktops and laptops made in the past 5+ years have 64-bit capable hardware.
Some netbooks and tablets used 32-bit only Atoms not that long ago, though you probably don't need to care about those.
I also saw some recent low-end laptops/tablets with 64-bit capable CPU, but 32-bit UEFI, which basically prevents the machine from booting in 64-bit mode (should be doable through BIOS emulation, but IIRC these things did not have a CSM, so no BIOS-boot).
-
That is a better qualifier. Specifically, I'm looking for statistics on how many desktops/laptops can run a build targeting x64 from C# in Visual Studio. (Very specific, but you know.)
Is cross-compilation a thing under C#?
-
Is cross-compilation a thing under C#?
With .NET in Visual Studio, cross-compilation - or forcing a specific CPU build - is simple. I haven't tried making a 64-bit build from a 32-bit machine, but there's no reason to doubt it works, considering a .NET executable still has to be compiled down to machine code when it's first run.
I guess really, the question is confusing. Is there a specific need to target x64 only @Matches, and not just AnyCPU?
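For reference, forcing a specific target is a one-line project setting (a minimal fragment; `PlatformTarget` is the standard MSBuild/C# compiler property, with `AnyCPU` as the default):

```xml
<!-- In the .csproj, inside the relevant PropertyGroup: -->
<PropertyGroup>
  <!-- AnyCPU (default), x86, or x64 -->
  <PlatformTarget>x64</PlatformTarget>
</PropertyGroup>
```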
-
I guess really, the question is confusing. Is there a specific need to target x64 only @Matches, and not just AnyCPU?
Calling native libraries?
-
I generally see a case for targeting 32-bit when it's actually needed (e.g. using COM). I'm sure there are some 64-bit-only libraries (I don't know of any offhand), but that's why I'm asking - to see if that's the case, or if it's something else, or if it's even needed at all.
-
Certain 3rd-party socket libraries that are largely all native. Newer versions support x86/x64, but Mono 2.6 doesn't support the new versions. (I mentioned in another thread some hackery I've employed to get around that.)
Essentially I'm debating the worth of pursuing the hack, dropping 32-bit support, or waiting for a Unity 3D framework upgrade.
-
In that case, best information I can find is that 4 years ago, roughly 46% of Win7 installs were 64-bit. I would assume the number has only gone up, and since you're talking Unity, I'm going to believe your target generally will be on 64-bit platforms (unless you're aiming at mobile as well).
-
Mobile isn't possible due to various aspects, so that's an acceptable loss.
While I agree it has likely only gone up, data that's years out of date is hard to swallow.
Do we have a census schedule for the intertubes?
-
I guess the other question I have is, what do the socket libraries do specifically that you want to avoid redoing?
-
ZeroMQ libraries - newest stable for the server, 3.5-compatible for Unity. Right now I'm using my hackery to move jsonified base64 data from the app to a custom launcher, which forwards it to the connected server. (Newest stable from the launcher and on the server.)
Benefits of the hackery: same version between the client (local server manager) and server side.
Downside: all items have to be sent through the named pipe before they can go out into the wild.
Benefits of direct use: known communication path between the two versions, no named-pipe client requirement.
Downside: a bit painful to use the older version, but not enough to make a significant difference.
Both should be easily switched out due to abstractions and interface contracts.
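For the curious, the launcher hop's framing is trivial - here's a sketch in Python rather than the actual C# (function names are made up; this just illustrates the jsonified-base64 wrapping described above):

```python
import base64
import json

def encode_for_pipe(message: dict) -> bytes:
    # App side: JSON-encode, then base64 so the payload is pipe-safe ASCII.
    return base64.b64encode(json.dumps(message).encode("utf-8"))

def decode_from_pipe(payload: bytes) -> dict:
    # Launcher side: unwrap before forwarding to the connected server.
    return json.loads(base64.b64decode(payload).decode("utf-8"))

msg = {"cmd": "connect", "args": ["server-1", 5555]}
assert decode_from_pipe(encode_for_pipe(msg)) == msg
```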
-
XBMC also has a breakdown of the platforms where their software is installed:
-
Looks to mostly agree with Steam's report then, from a percentages perspective.
-
Some netbooks and tablets used 32-bit only Atoms not that long ago, though you probably don't need to care about those.
Yeah, they don't count.
I also saw some recent low-end laptops/tablets with 64-bit capable CPU, but 32-bit UEFI, which basically prevents the machine from booting in 64-bit mode (should be doable through BIOS emulation, but IIRC these things did not have a CSM, so no BIOS-boot).
And that's why I said most.
-
Is cross-compilation a thing under C#?
Yes.
-
I haven't tried making a 64 bit build from a 32 bit machine, but no reason to doubt it works
Works, and I can confirm it, because my company is too cheap to give me a decent dev machine. When not running over RDP, it's a bit of a pain in the ass to always remember to build a 64-bit version before deploying it.
-
Question from the peanut gallery: what does it matter if you're 64-bit only or not?
Unless you're doing a serious amount of data crunching, I can't see why it should matter.
-
A lot of games are 32-bit only and that means I can't load maps the size of the galaxy which upsets me.
-
Loading multi-GB data blocks is surely a WTF in itself.
-
Well, when you're limited to 2-3GB for all the models, textures, sounds, shaders, executable code, scripts, and maps, you tend to run out of space quite quickly.
-
And there's no way to optimise any of it?
-
There is, using large amounts of techniques from stream loading to culling, but when you're talking about memory intensive 3d games sometimes it's still just not enough.
I haven't surpassed the 32-bit limit yet, but I don't really want to build for a 32-bit platform since it's on its way out for my desired platform.
But I'm looking at the next 3-5 years, not this year - I'm trying to judge what the market will look like then.
-
That sounds like a more reasonable answer than 'because I have a lot of shit to load' ;)
If you're building for the next 3-5 years, you can pretty much assume 64 bit will be the only thing going by then.
-
32-bit only and that means I can't load maps the size of the galaxy
That just depends on how one of your pixels maps to reality.
-
I generally assume a human takes at least one pixel in a game.
-
That's a pretty big human in relation to the galaxy.
-
I never said the whole map fit on the screen at once!
-
So then you can load maps the size of the galaxy on 32 bit. Use proper culling.
-
That wouldn't be a map, then. That'd be something like Half-Life where the world is split up into a bunch of tiny maps that just have very similar parts at the edges.
-
So it's a chunked map then.
Also, Frontier had an entire galaxy in a few hundred KB.
-
That wouldn't be a map, then. That'd be something like Half-Life where the world is split up into a bunch of tiny maps that just have very similar parts at the edges.
What's undesirable about this approach?
-
I wanted to load the entire galaxy at once without waiting for disk. I can't do that on 32-bit.
-
Assuming 1x1m for a person, you can't fit the galaxy in 2^64 memory addresses either. The galaxy in a 1-bit square image takes up 9,460,528,400,000,000,000,000 pixels, while 2^64 = 18,446,744,073,709,551,616, which is too small by a factor of about 500.
But I'm sure you have either a workaround, or a proof that my brain is full of fuck. I concede that I may have miscounted a 0, because that's really easy to mess up.
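Redoing the back-of-envelope in Python, assuming a ~100,000 light-year diameter (the usual Milky Way figure) and 1-metre, 1-bit pixels - if you count the full square area rather than just one edge, the shortfall is astronomically worse:

```python
# Back-of-envelope: can a 1-bit, 1 m/pixel square image of the galaxy
# be addressed with 64-bit pointers? Assumes a ~100,000 light-year diameter.
METRES_PER_LIGHT_YEAR = 9.4607e15
GALAXY_DIAMETER_LY = 100_000

side_pixels = GALAXY_DIAMETER_LY * METRES_PER_LIGHT_YEAR  # one edge, ~9.46e20
total_pixels = side_pixels ** 2                           # whole square, ~8.95e41
addresses = 2 ** 64                                       # ~1.84e19

# Even a single row of the image blows past the 64-bit address space.
print(f"one row:      {side_pixels:.2e} pixels")
print(f"whole square: {total_pixels:.2e} pixels")
print(f"2^64:         {addresses:.2e} addresses")
print(f"short by a factor of {total_pixels / addresses:.2e}")
```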
-
Well, most of the galaxy is relatively empty, so you could probably save a lot of space by using something sparser than a raster image.
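A toy sketch of the idea in Python (purely illustrative): store only the occupied cells, so memory scales with the number of stars rather than the area of the raster.

```python
# Sparse 'galaxy map': keep only occupied 1 m cells in a set instead of
# a dense bitmap, so empty space costs nothing.
occupied = {
    (4_230_000_000_000, 1_900_000_000_000),  # some arbitrary star cell
    (4_230_000_000_001, 1_900_000_000_000),
}

def is_occupied(x: int, y: int) -> bool:
    return (x, y) in occupied

print(is_occupied(4_230_000_000_000, 1_900_000_000_000))  # True
print(is_occupied(0, 0))                                  # False
```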
-
Well, you can compress it for storage, but it's not compressed when it's in memory, and if you track changes you still need to be able to address every single pixel in that square.
You could drop all the void outside the galaxy, I guess, and get a 30% bonus, but that would require some extra weird mapping shenanigans that I don't even want to think about.
-
And even then, all you have is a ridiculous 2D galaxy.
Filed Under: KHAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAN
-
Assuming 1x1m for a person
You have quite an interesting body build.
Well, if you're compressing it and storing it, but it's not compressed when it's in memory, and if you track changes, you still need to be able to address every single pixel in that square.
Technically, you could store the data on HDD - NTFS supports 2^64-1 clusters - and I think the addressing would be possible with some software-simulated 128-bit values. I am, however, fairly certain that buying out all the HDDs in existence and hooking them up into one RAID array would prove problematic.
Filed under: it might be easier to create a galaxy and write an API for it
-
It speaks to the scale of the cosmos that it currently utterly outdoes not only our transportation methods, but our simulation capacity as well.
(well, if you want meter resolution, that is)
-
Isn't this what mongodb is for?
-
Nah; MongoDB is only web-scale, not galaxy-scale.
-
But imagine the blakeyrants if he tried to go galaxy scale.