IBM Watson autobiography thoughts on 762
Anne & Lynn Wheeler
I wonder if compatibility was a big issue in the late 1950s.
IBM didn't come up with it until the early-1960s SPREAD conference. I believe part of the motivation was internal: IBM realized it had to support a whole bunch of diverse platforms, with systems software, application software, and peripherals for each platform. I don't think the other companies had as many models to worry about keeping compatible.
Indeed, even within IBM there was great dissent over compatibility. Haanstra wanted to put out a super-1401 using SLT chips. Others were afraid of losing existing customers who wanted more while S/360 was being developed and implemented. (Honeywell was pushing a "Liberator" converter for 1401 customers.)
IBM Watson autobiography thoughts on 767
John R. Levine: basically, leases were somewhat like cellphone billing ... basic plan and possibly a lot for overages ... based on the CPU meter. The meter ran while...
Not without an awful lot of screaming and fighting.
IBM Watson autobiography thoughts on 763
I always understood it more as an issue of the idea of a 'business' computer versus a 'scientific' computer, i.e., the computers were so specialized for one purpose that they generally couldn't do other things...
We take compatibility across large-small-scientific-business for granted today, but the engineering for those four was quite different and very significant in the early 1960s. IBM went through an awful lot of grief trying to accomplish compromises across the whole product line. They had trouble building the high-end machines, and the lowest--the Model 20--was not compatible.
They presumed a single operating system would handle everything and that was a disaster.
They still had a lot of trouble handling time-sharing and DAT (dynamic address translation)--stuff that had to be added later.
One night when I had exclusive use of our mainframe I tried some benchmark experiments with repetitive calculations in various arithmetic types. I found using the right mode (binary or packed decimal) made a big performance difference. Binary was very fast, but only if you left it alone and didn't need frequent converts back and forth; otherwise packed decimal was better. (I don't know about today's mainframes, and ours is so busy all the time it's harder to run such a test; plus today's machines are incredibly fast. I still don't know why numeric-intensive researchers don't use plain IBM mainframes nowadays, especially since the instruction set has been expanded to handle very long floating-point precision and built-in math functions.)
I am impressed that the original designers put both methods in the original architecture: both word (binary) and character (decimal) handling. I am also impressed with the address handling, so that small machines didn't waste too much space on unnecessarily long address fields.
IBM Watson autobiography thoughts on 765
The computer might do the job X times faster than the sorter, but if a computer minute cost more than X sorter minutes...
Undoubtedly emulation kept a lot of existing customers in the fold. But I think the IBM 'hand-holding' helped attract many new customers.
IBM Watson autobiography thoughts on 764
Philip Nasadowski: Yes, that was true. Certainly any application could run. But the CPUs were far slower in those days, and performance suffered when the wrong application type was run. Remember that early PCs did...
I've worked with people from some other major makers, and IBMers were in a class by themselves. They were virtually all incredibly well trained, extremely polished, and thoroughly prepared.
I must admit I liked the competitor's employees as people--they seemed more natural and down-to-earth, while IBMers were ever-so-proper. Univac in Blue Bell, PA was a world of difference from the IBM offices. But we don't buy computers based on whose reps are better to pal around with; we buy them to get work done in the most efficient way possible.