What ever happened to Tandem and NonStop OS 2010
I gave up a long time ago. Her persistent "I refuse to understand anything that's not TOPS-10" certainly drove me to...
And see all the examples of "One man's noise is another man's information".
Ref. the discovery of the universe's background radiation, or the narrow avoidance of the US Navy scuttling priceless observations, where the meteorologists and the astronomers were interested in what was hiding where the Navy saw only noise.
OTOH, we had a situation where the floating point context switch wasn't complete due to a poorly documented FPU register, leading to a case where a particular sequence of task switches and...
This is an important part of signal intelligence. "Framing" the signal to frequency, space, time etc. is the first part to be done. Often a lot of significant information can be gained from the mere framing part.
I have finally understood what the difference between these two mindsets is, and it is important. I am not satisfied with the naming, though; but I struggle to come up with better names.
Let me recap my understanding.
I don't care for it either. I used the nouns because they described the work (thus the thinking habits) of the people...
Both of these involve intensely symbolic analysis of representations, but they diverge in the abstraction layer and in the processes they study.
As I see it, the OS mindset does the analysis pretty close to reality, and is very focused on observing, manipulating and changing state that is closely coupled to physical reality. Modelling has lots of state representations, with a lot of emphasis on clean transitions, but less on typed and symbolic structures. E.g. it is more important to get the context switch done in a provably reliable manner than to have very clean structures to represent the two processes.
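To make the contrast concrete, here is a minimal sketch (in Python, purely illustrative, with names of my own invention) of a context switch as state save/restore. In a real kernel the "cpu" state is the register file, including the FPU registers; forgetting one of those is exactly the kind of bug described above:

```python
# Illustrative sketch: a context switch as save/restore of machine state.
# Real kernels save CPU registers (including FPU state), not Python dicts.

def context_switch(cpu, outgoing, incoming):
    """Save the running process's state, restore the next one's."""
    outgoing["saved"] = dict(cpu)       # snapshot ALL of the live state
    cpu.clear()
    cpu.update(incoming["saved"])       # restore the incoming process
    return incoming

cpu = {"pc": 100, "acc": 7, "fp0": 3.14}
proc_a = {"name": "A", "saved": {}}
proc_b = {"name": "B", "saved": {"pc": 200, "acc": 0, "fp0": 0.0}}

running = context_switch(cpu, proc_a, proc_b)
# cpu now holds B's state; A's state (including fp0) is safely saved.
```

The OS-mindset concern is that the save in the first line really captures everything; the structures themselves are deliberately crude.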
I would introduce a subtype, the network protocol mindset. It has the same tradeoffs, but the focus is on making state transitions traverse the physical world rather than on handling the interactions per se.
Greg Menke Oddly enough the compiler weenies probably get there first too... What do you think happens...
Beyond a certain level these fields turn a little towards art. The baseline is clearly reproducible on other systems based on solid theory, but the last bits of tuning are based on a closed loop; Boyd's OODA loop is pretty close to the behaviour I see here.
You have to care in a variety of ways if you intend to operate an internetworking protocol, or even a protocol...
These people thrive when they can tune protocols, transfers, throughput etc., and hate meetings, papers without a lot of formulas, and non-formal specs.
In academia they are pretty lost in CS, but thrive in some math and many hardware-electronics and physics fields.
The "networker" subtype handles specifications and somewhat abstract descriptions a lot better.
Morten Reistad I don't think they are mindsets, I think they are skillsets. The naming is...
An example of output from these minds is the PXE boot process, for netbooting PC-based servers. It lets the server perform magic on the clients by sticking in code to be executed on the machines that ask to be booted, and all responsibility for further success lies with that code.
The Compiler mindset has a lot more abstraction, and is about representations of thought and processes. I would introduce the "Database mindset" as a sub-type.
They also observe, manipulate and change state, but do so at considerable semantic distance, and often use the language for expressing it as a significant part of the resolution process itself.
They tend to want all work anchored firmly into theory, and are prepared to formulate that theory and write the necessary papers as a part of the implementation. A project may also start with several days of reading time on relevant theory.
GML and its descendants SGML, HTML, XML etc. are examples of work by these minds. Powerful, neat and clean, but not immediately useful before tools like the WWW appeared. With those tools it becomes extremely powerful.
The database sub-type deals with data and processes as distinct entities, and is a little uneasy with self-referential data, like SQL statements written into SQL databases. The "other" half of this mindset does such things all the time, as playing with the language is part of the solution.
HTTP-HTML is an excellent example of a creation by this mindset. The protocol is extremely lean; it is just glue on top of TCP. All the work is done through languages, and new ones crop up all the time.
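To show just how lean: a complete HTTP/1.0 GET request is a couple of text lines sent down a TCP connection, and everything interesting lives in the languages carried on top. A sketch (the host name is a placeholder):

```python
def build_get(host, path="/"):
    # A full HTTP/1.0 GET request: request line, one header, blank line.
    # The protocol itself is this thin; HTML, CSS etc. do the real work.
    return f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n".encode("ascii")

request = build_get("example.com")
```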
Here it is usually taken as an indication of getting too far down into the wine bottle when the group suddenly discusses physical reality in existentialist terms.
An excellent example where a compiler thinker would not have a problem at all: just define a transition and let it have actions with persistent results if taken, while making sure the system as a whole remains stable.
The OS thinker would set up a timer and retransmit when it fired.
In this case the compiler thinker is the winner if the packet-loss rate times the retransmit timeout is much above zero.
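A back-of-the-envelope sketch of why (my own formulation, assuming each transmission is lost independently with probability p and every loss costs one full timer period): the expected number of timeouts before success is the mean of a geometric distribution, p/(1-p), so the timer-based approach pays that many timeouts on top of the round-trip time:

```python
def expected_delivery_time(rtt, timeout, p):
    """Expected delivery time with timer-based retransmit.

    Assumes independent loss with probability p per transmission;
    each loss costs one full `timeout` before the next attempt.
    """
    expected_timeouts = p / (1.0 - p)   # mean failures before first success
    return rtt + timeout * expected_timeouts
```

With 10% loss and a 2-second timer, that is roughly 0.22 s of extra delay per packet on average, which is why the penalty grows quickly once loss times timeout moves away from zero.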