@gleemonk said in Tablet for making phone calls:
@ScholRLEA I mean the ones that were made before I was born. Those were good I assume. Everything after them was crazy zoo shit from my experience.
Not knowing how old you are, I can only guess... but no, they were if anything worse. The only way that the IBM PC was better than the competition was in the sense that a) it was 16-bit before most of the others were (but a really half-assed 16-bit with an 8-bit external data bus, so...), and b) it had the letters I B M on the name plate, which was more familiar than names like 'Apple', 'Osborne', 'Kaypro', or 'Commodore' (though 'Tandy' had decent name recognition, too, as did Atari for any household with teenagers, and Apple was at least getting into the newspapers). The PC was actually a pretty mediocre small computer for its time, but IBM had the most ferocious marketing department in the world, and were not at all averse to dirty deals. There's a reason why the old-school hackers hated Big Blue.
Now, mind you, it didn't exactly set the world on fire, either, with sales that would have been considered sluggish had IBM been thinking of personal computers as anything more than fancy terminals. The PC was designed by a rogue group within IBM who were basically told to 'do something' about these small home computers that were popping up, then given a shoestring budget and left to their own devices. That IBM marketed it at all showed that they were aware of the threat small computers presented; that they weren't being more aggressive about it showed that they weren't taking the threat seriously enough.
But what really made the PC, ironically enough, was IBM losing control of the platform, and that in turn was due to them not acting like the quasi-Stalinists they usually were when it came to the hardware peripherals and the operating system. Because they didn't put the budget into making the CPU and OS in house like they usually did, or into crafting the tightly-structured upgrade paths they so dearly loved, they ended up almost by accident making a superbly expandable system - and one with only a very minimal firmware. This last part was the real key, and had IBM realized it, they would have thrown that key away.
You see, IBM was used to beating copycats like Amdahl, and third-party add-on manufacturers like Shugart, over the head with their tightly controlled trade secrets. But for the PC, the only way to make it work was to hand things like the BIOS documentation and the bus signals out to all and sundry, because they weren't going to spend the money it would take to do it themselves... which meant that a newcomer like Compaq could sneak in with an API-compatible clean-room reproduction of the BIOS and make it stick in court.
That opened the floodgates when it came to hardware. Software? Digital Research and the UCSD group had both been selling their OSes for other systems long before the 5150 was designed, so there wasn't much IBM could do about that, but they must have been napping when they cut the deal with Microsoft, because Gates and Allen insisted on the same open-ended terms for MS-DOS and got them. And unlike CP/M-86 and UCSD Pascal, Microsoft was willing to sell for volume, at about a third the price of the other two, not just to IBM but to Compaq, Digital Equipment Corp., and anyone else looking for a cheap, plain-vanilla 16-bit OS.
It was IBM that made the PC, but it was IBM's failure to shut down Compaq's clean-room BIOS that made it a success.
But trust me, that success was despite the platform, not because of it. There were a lot of systems that were cheaper, and a lot of systems that were better, but the PC clone market hit the sweet spot of both reasonable price and (barely) acceptable performance.
Of course, a lot of the early PC clones and workalikes were genuine stinkers, and the 'compatibility' was often very poor indeed. The AT&T 6300 comes to mind - it had a proprietary backplane rather than a motherboard, and used an 8086 rather than an 8088 (which could have made it faster, if there had been any peripherals or software that took advantage of it), and while it was a decent design overall, it was barely a PC - it stank on ice when it came to running existing PC software; hell, most didn't run at all. The 'standard' was the IBM hardware, and while a lot of the details were published - at least until the PC/AT came out - a lot had to be reverse engineered or even guessed at, which made building 8-bit cards for the PC only slightly less infuriating than building them for the notorious S-100 bus, and anything 16-bit was a crapshoot.
Finally, IBM decided that things were out of control, and tried to rein the market back in with a new 16/32-bit bus, the Micro Channel Architecture. MCA was far superior, technically, but was so loaded down with patents that cloning it was out of the question. IBM made a token effort to license the bus design, but they deliberately priced the per-unit royalty so high that anyone who tried to sell a PS/2 compatible for less than IBM's own PS/2s would lose their shirt.
This obvious attempt at stomping their competition flat backfired so badly that it effectively took IBM out of the desktop market by 1992. It also led to something unheard of - a formal standard for the 16-bit PC bus, now called Industry Standard Architecture. The ISA standard existed mainly to give the designers of its intended 'open' successor, EISA, something to contrast with, but EISA itself proved far too expensive at a time when most PCs were still using the 8-bit XT bus. By the time a true 32-bit bus was really needed - mainly for accelerated graphics, since by then main memory was on SIPPs or SIMMs slotted directly into the local bus rather than on ISA expansion cards, and most other peripherals never really needed the bandwidth, even today - both MCA and EISA were out of date, and the VESA Local Bus, the Accelerated Graphics Port, and eventually the Peripheral Component Interconnect buses ate everyone else's lunch.