Change in computers as a hobbyist
Then I have no choice but to strongly disagree with your disagreement.
The difference lies not so much in the ability to tinker but in the motivation that drives the desire to tinker. Back in the 80s, home computers (the 8-bit breed of them, with processors like the MOS 6502 or Zilog Z80) were an empty canvas. Most of the applications and concepts we're now taking for granted had not been invented yet, or at least not implemented on lowly home-computer hardware. People who had computers at home were almost exclusively hobbyists - not using the computer for productive work but experimenting with new concepts and ideas.
It had not yet been generally accepted that personal home computers could be seriously used for productive work - and even if there were some people who nurtured this possibility, there were only vague ideas about what kind of "productive work" that could be. (Cataloguing your freezer? Nah, too much manual work. Word processing? Nah, the TV set is too low-resolution a display for real word processing. Creating music? Nah, you can only get some tinny-sounding beeps out of the box.)
Take the Commodore 64, for example. Every step that came along the way was a source of awe and delight (you can do THIS on a computer?!), and a major step towards what we have now. You can actually do real-time 3D vector graphics on your 8-bit home computer? Who would have thought of that when you purchased the thing? Hey, someone has figured out how to play digitized sound samples with a sound chip that doesn't really have that capability. What? They're changing the graphics modes on the fly, in the middle of the screen. What's this? A whole mouse-controlled GUI desktop for the Commodore 64, with proportional fonts, a word processor, a paint program and all? Huh? Undocumented opcodes? A raster trick for unlimited sprites? Code for opening the borders and filling them with graphics? MIDI interfaces? A scanner head for your dot matrix printer? Desktop publishing on a home computer?
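For those who never saw it done: the digitized-sample trick worked, roughly, by abusing the SID chip's master volume register ($D418) as a crude 4-bit DAC - write successive sample values to it at a steady rate and recognizable audio comes out of a chip never designed for PCM playback. A minimal sketch of the data-preparation half, assuming 8-bit unsigned source samples (the playback loop itself, of course, only makes sense on the real machine):

```c
#include <stdint.h>
#include <stddef.h>

/* Quantize 8-bit unsigned PCM down to the 4-bit values (0..15) that a
   playback routine on the C64 would write, one per sample period, into
   the SID master volume register at $D418. Keeping the top 4 bits of
   each sample is the simplest (if noisiest) reduction. */
void pcm8_to_sid_volume(const uint8_t *pcm, uint8_t *out, size_t n)
{
    for (size_t i = 0; i < n; i++)
        out[i] = pcm[i] >> 4;   /* 0..255 -> 0..15 */
}
```

On the machine itself the rest was a tight, cycle-counted loop (or a timer interrupt) storing each value to $D418 - exactly the kind of undocumented-behaviour exploitation the paragraph above is celebrating.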
Then came the Amiga. Multitasking, 4-channel digital stereo sound, a GUI in color, video digitizers, raytracers, genlocks, paint programs with 4096 colors, etc.
The canvas was empty when the first machines rolled out of the factory lines, but the hobbyists painted it full of these wonderful things - stretching the limits considerably beyond what the original hardware designers ever could have imagined, and literally creating the concept of a multimedia home computer with which you can do useful things, instead of just playing a Pong game. The idea of a computer at home formed during the 80s. Back then, you, I, everyone could literally create new industries by inventing and implementing new, then-unbelievable applications for computers at home. It was innovation fireworks in action, literally opening-up-new-horizons kind of stuff.
Not so now. Small innovations still happen, of course, but mainly it is all just rehashing the old concepts and re-presenting them with new eye-candy. The industry is centered around making the processors and I/O buses faster and giving the computers more memory and mass storage capacity, but the truly ground-breaking stuff - as far as creating the concept of having a multimedia GUI computer at home and figuring out what to do with it goes - was already done during the 80s. What's there to discover now?
Examples would help in getting a better idea of what you're referring to, specifically. Of course there's lots of faster-and-with-more-capacity stuff coming out every day, but it's just more of the same, faster.
I can really think of only one relatively new hardware application that I have found intriguing during its recent proliferation: WLAN. (Technology that allows linking two computers via radio was unavailable to the common hobbyist or home computer user back in the 80s, UNLESS you were a radio amateur as well.)
When I bought my first computer, it had nothing but a BASIC interpreter and a few commercially available games. By the time I stopped using that computer almost six years later, all the above-mentioned things had been invented and implemented for it - mostly in the bedroom of some like-minded hobbyist - and brought to the fingertips of all the users of the same platform. Tinkering in those days was rewarded with an unexpected ability to do something totally new; bleeding-edge stuff that no-one thought possible.
Tinkering with the current breed of computers - well, good luck trying to create something as impressive as the first computer program that displays real-time 3D vector graphics on a home computer, or the first computer program that plays back digitized music on affordable hardware which doesn't even have that functionality, officially.
The computers of the 80s had - for the most part - standardized video, audio, and other I/O hardware. You could bang the hardware directly, be the undisputed king of your machine, and expect your program to work on your friend's (identical) computer as well. More to the point, there was documentation about how to access these things.
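"Banging the hardware directly" meant exactly that: the chips were memory-mapped, so a single documented store changed the machine - on the C64, for instance, writing to $D020 set the VIC-II border colour instantly. A sketch of the idea in C; the register address is passed in as a parameter purely so the function can run anywhere, but on the real machine it would simply be `(volatile uint8_t *)0xD020`:

```c
#include <stdint.h>

/* Memory-mapped I/O in one line: storing through a volatile pointer to
   a hardware register address is the whole "driver". On a C64 the
   border-colour register lives at $D020; values 0..15 select colours
   (0 = black, 1 = white, ...). */
void set_border_colour(volatile uint8_t *border_reg, uint8_t colour)
{
    *border_reg = colour & 0x0F;   /* only the low 4 bits matter */
}
```

The point of the paragraph above is that this one-line level of control was fully documented and identical on every unit of the platform - which is precisely what has been lost.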
Not so today. Even disregarding the fact that every computer these days is different and no-one has quite the same standardized setup as you, you can't even get register-level documentation for the current video and sound chips. The manufacturers keep it to themselves, for competitive reasons. (Just try asking nVidia to release register-level documentation for their 3D accelerators. They won't.)
Basically, if you're not in the mood to create a whole operating system of your own, you're stuck accessing everything through layers and layers of code someone else wrote. Considering how complex and diverse the current machines can be, that is not necessarily bad, but you're not able to explore and control your machine anywhere near the way you could in the 80s. Now you'll mostly have to rely on others and their code, and hope that they didn't make too restrictive (or outright bad) decisions when designing the interfaces or implementing the drivers (since you usually can't work around the problem if they did).