IBM 610 workstation computer 3399
Yes, but it wouldn't come up.
IBM 610 workstation computer 3401
Ok, then I stand by my assertion that main memory (particularly memory virtualized and paged with an MMU) is...
Thanks for the detailed explanations.
First, I don't agree that the path between CPU and memory is a "bottleneck". In the earliest computers, it was a tradeoff between speed and cost. That is, one could transfer memory contents as a large block all at once in parallel (expensive but fast), or one could transfer it slowly, one bit of one character at a time (cheap but slow). As time went on and electronics advanced, this function became ever faster and more powerful.
Secondly, I think the idea of an address space makes it easy to understand the whole concept of programming. We may think of memory as a giant post office with lots of little mailboxes (pigeonholes), each having its own address and containing information. (The size of the mailbox, be it a bit, byte, or doubleword, doesn't matter.)
I think that is a good thing.
IBM 610 workstation computer 3404
Snooping is a completely different kettle of fish. Heisenberg's Principle applies here. Even on machines built in simpler times, the snooper had to be very careful...
Well, in COBOL we may refer to records which may contain literally thousands of characters. I believe the mainframe has move-long instructions to handle that, and I believe later versions of Fortran can also handle long records.
I can't help but suspect trying to work with "structures" doesn't make it simpler but actually more complex, and you end up with a language like APL. APL's proponents claimed it was the essence of simplicity since so few "statements" could get a tremendous amount of work done.
IBM 610 workstation computer 3400
KR Williams: Whether software or pure hardware, I do not care. The buses are now the...
The problem with the APL (and similar) approaches is that they require one to have strong mathematical and abstract aptitude to understand the language. Mathematicians love APL because they understand the concepts and can make good use of it. The rest of us can't, which ends up making the language less accessible. The industry wanted to go in the opposite direction.
But I think that is the "natural" way to go, one that translates well to the human mind: we take a number from one place and copy it down in another place, e.g., we take the amount due from an invoice and write it down on a check.
This kind of thing may be useful in optimizing very heavy numerical applications, but I'm not sure about general programming or business applications.
Again, this might be good for a compiler to do for optimization purposes in intensive applications, but not necessarily for general-purpose work.
I fear such an approach would lock in the design too tightly. If changes come through, as so often happens, the binary tree would have to be altered, and a beautiful layout would become worthless.