[This Year's December Days Theme is Community, and all the forms that it takes. If you have some suggestions about what communities I'm part of (or that you think I'm part of) that would be worth a look, let me know in the comments.]
I didn't actually intend to follow this pathway. What I wanted to do was play games for hours on end, exploring, leveling up, and otherwise gaining mastery over a plethora of games in different styles and genres, with no greater ambition than that.
The problem, of course, is that computer gaming in that era was intimately entwined with the progression of computers and technology themselves, and therefore, to learn how to run and play games, you had to learn about the underlying computer components, how it was all put together, and at least some amount about operating systems and the constraints they imposed, the ways to navigate a command line, the need for device drivers, and how file systems worked.
I learned more about computer systems by trying to play games on them than I would have from the other pursuits of technology that I was being encouraged to explore, including programming.
This is not a "Kids these days have it so easy!" kind of rant. Kids have it much harder these days in trying to learn about technology, systems, and the like, because many of their devices are sealed units with embedded systems on a chip that cannot be upgraded or swapped, running operating systems that have been locked down as much as possible so as to make it easier for someone else to control their devices. This is supposed to make the devices easier to use, and safer as well, by not giving the user much, if any, control over what purposes to put their devices to. An ecosystem mediated by applications that are available from one storefront is an exceedingly restrictive one, meant to enrich the corporation rather than offer options to the user.
Neither, however, do I want to pitch a child, or anyone else, into the deep end of a system with no help and a blithe assumption that either they will figure it out and swim, or they will sink and become "lusers" that I no longer have to care about or devote effort and time to. I suspect the era I grew up in was unmatched for the way that it provided natural scaffolding for learning about computers and systems. (Even if I never have soldered a chip to a board in my life.) The leaps and bounds of processing power, memory, and graphics, along with new motherboards, peripherals, and protocols to assist, were matched by games taking advantage of all of those new tricks and tools to improve themselves.
So I learned how to upgrade, to swap components, to make sure my memory was compatible, to connect new peripherals, to manage memory with boot disks (and how to format and reformat those disks to be useful storage again after they had finished being boot disks), how to traverse the MS-DOS file and directory structure, how to split files up over multiple floppies, as well as the dangers of playing track 1 on a CD that was hybrid data and audio. How "Please insert Disk 2" during an installation gave way to "please insert disc 2" while playing a game, and how the disc changes usually meant a point of no return. The fact that Final Fantasy VII was four discs, when most of my games had been just one. (And how annoyed I was that after VII and VIII, I wouldn't see a non-MMO Final Fantasy game with a simultaneous PC release until XV.) I learned that a 3dfx card was an add-on that sat alongside a video card rather than a replacement for it, and how minimum requirements sometimes really did mean that.
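(For anyone who never had to build one: the memory-management side of a boot disk usually came down to a CONFIG.SYS and AUTOEXEC.BAT pair along these lines. This is a generic sketch rather than a recreation of any disk I actually made; the specific driver names, paths, and sound settings are just typical examples.)

```
REM ----- CONFIG.SYS (the memory-management half of a boot disk) -----
REM Load the extended memory manager, then use EMM386 to open up
REM upper memory blocks so drivers can be moved out of the first 640K.
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE RAM
DOS=HIGH,UMB
REM CD-ROM device driver, loaded into upper memory (driver name illustrative).
DEVICEHIGH=C:\DRIVERS\OAKCDROM.SYS /D:MSCD001
FILES=30
BUFFERS=20

REM ----- AUTOEXEC.BAT (load the rest high, then hand over to the game) -----
@ECHO OFF
REM MSCDEX gives DOS a drive letter for the CD-ROM device named above.
LH C:\DOS\MSCDEX.EXE /D:MSCD001
REM Mouse driver, also loaded high to keep conventional memory free.
LH C:\DOS\MOUSE.COM
REM Sound card settings that many games read from the environment.
SET BLASTER=A220 I5 D1 T3
```

The whole point was clawing back enough of that first 640K of conventional memory that the game would actually launch instead of complaining.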
I built several of my own computers, and have replaced or added at least one component in every desktop I have owned. And then I needed a suitable environment to do my graduate work with, so I added a second hard disk drive and learned how to use Linux on it, which, admittedly, I'm still learning how to use. I've had to work with beep codes, cryptic error messages, making sure all the components are seated correctly and securely, and more futzing around with bootloaders than I really ever wanted to. I cannibalized old iFruits to create some stronger iFruits that could run a newer version of OS X, got paid for it, and learned to curse whoever thought those designs would be easily user-serviceable. (It's Apple, they don't actually care.) When I got smartphones and tablets, I tried to put aftermarket operating systems on them. I've done the same to some game consoles. I have a smart home brain, and I've successfully rescued two Chromebooks from obsolescence and put Linux on them, too.
These days, though, most computers that come into existence are strong enough for most applications, and games are increasingly chasing bigger and better as technology demonstrations, rather than because the gains in technology allow for improved methods of storytelling, more color, or anything else of the sort. (Rather flippantly: games seem more interested in making sure the additional processing, graphics, and storage requirements go toward ensuring that Lara's or Kasumi's breasts jiggle more realistically than in taking advantage of those capabilities to tell stories they couldn't tell before.)
So, yeah, it looks like I'm some kind of wizard of technology, and being a trained information professional also means that I can look up the procedures of how others accomplished a particular task and replicate them. Which only makes me look even more impressive. A bona fide computer nerd who can figure out what is going on and fix it. (Much of what I do at work is more "check to make sure it's plugged in and turned on" than anything else.) I, in turn, look at the people who actually do IT work, network maintenance, ops, and so forth, and think of them as the actual computer nerds, since they do it professionally and are trained to do it, and I am an amateur enthusiast at best. I would like there to be more amateur enthusiasts as well, and I'm still going to do what I can to help my users and coworkers navigate their tech issues. All of that amateur enthusiasm and problem-solving does give me an appreciation for people who have to do it professionally and on systems that will cause major issues if they can't be brought back up quickly and easily.
Professionally, I often describe a vital part of my job as "translating from human to machine and back again," which, I suppose, betrays how much understanding I have of machines and their software programs, since I'm usually acting in that intermediary layer between someone who has a task to accomplish and a machine that will help or hinder them in accomplishing that task, depending on how much they can tell the machine to do what they want. Most library workers have some amount of that skill, even if they're not formally trained in it, after they work in the library for long enough and are exposed to enough of the common scenarios, situations, and issues that crop up in technology usage in the library. The rest of it is having the search ability to find a tutorial, or to find the right terminology and then a tutorial, and then to translate it back to someone who is at the computer and will have to do the clicking and keyboarding to make it happen. (And then promptly forget what it is they did, because they only ever expect to do this once and no other time.) The highly generalist streak of the library worker, and the foundations taught in library classes precisely so they can be applied to a wide range of possible scenarios, give me a broad ability to work with things, even if they don't always produce deep or specialized knowledge. As I mentioned in an earlier post in the series, I have approximate knowledge of many things, and that includes computer stuff.
It is, I suppose, a matter of perspective. Things that I can do look like computer magic, but not to me, because it's something I know how to do and have done enough times that I can look for telltale signs of the issue, or I've done the procedure enough times that it's routine. I might be able to follow along a little with what the professionals do, but the things they do are just as arcane to me as the things I do might be to the users who think I'm impressive. Oftentimes that's because it looks like they're coming up with solutions from their own knowledge or their own ingenuity, juggling a bigger picture than the one in my head, or figuring out what works best for everyone, even if that means nobody gets their optimal solution. I know my own interiority, and that so much of what seems impressive to someone else is lots of experience with those things, plus the ability to use my search skills to find solutions that other people have written up. Those trained professionals probably do mostly the same things, but they have their own sources of quick and deep reference to work with, as well as their own experiences, which make problems that would stump me and overwhelm me with their urgency more manageable and more routine for them.
I certainly didn't intend to become one, but everything that I've done in life, each of those tasks that I wanted to do, each of those games I wanted to play, they've all contributed to my becoming one of the computer nerds. (Yes, Hardison, I hear you. Age of the Geek, yes, but many of those geeks turned out to be people who didn't have strong morals, had their morals easily swayed by the promise of lucre, or are working somewhere that does something they find objectionable or at least uneasy-making, because capitalism prevents us all from working our perfect jobs and instead forces us to work in places that can pay enough to satisfy capitalism.)
no subject
Date: 2024-12-17 06:46 am (UTC)

I found computing because it was placed in front of me, and then it grew up around me from "neat toy" to "central to daily life". Lucky timing was a lot of it.
Fall 1979: a local organization that created enrichment opportunities for gifted kids outside of school was offering classes. One of them was basic BASIC computer programming, taught by a couple of programmers at Weyerhaeuser using their downtown Tacoma HQ as the venue, in a room full of printing terminals with acoustic couplers to connect to their minicomputer. I was instantly hooked by the idea of writing a series of steps for the computer to perform.
Late 1982: my parents got me a computer (one of Mom's coworkers was upgrading his Apple II+ and getting a //e). I learned a lot about computers by having one I could open up and expand.
Mid-1983: I saved up and bought myself a 300bps modem so I could call bulletin board systems; my first taste of the online world.
1986: off to college, not majoring in computer science but rather business administration since I had the impression that comp sci was more...theoretical? Programming to get things done was Information Systems (now mostly known as IT) and that was actually more of a focus in the management school.
I was soon working at the computer center help desk, more for the access to different kinds of systems than for the money; that's where I was introduced to UNIX and the Internet.
By the time I graduated in 1990, I was glad I had a degree that would get me a Real Job; only universities and a few companies used this Internet thing, so as much as I liked using it, such jobs were going to be hard to come by. Needless to say, that changed very shortly afterwards! I found myself in a great position, since I knew more about it than most; even if I didn't have specific coursework that covered it, nobody else did either!
So I wound up as the EFF's first full-time sysadmin.
Then I worked on the Human Genome Project.
Then I helped build a modern architecture airline reservation system, as opposed to the 1960s mainframe tech still in use today. (It launched successfully but was later cancelled, unfortunately.)
Now I work for a major tech company that didn't even exist when I first started getting paid for this stuff, let alone when I found computing.
no subject
Date: 2024-12-17 09:59 pm (UTC)

Yeah. Having knowledge go from "neat thing I learned for its own sake" to "career opportunity" even once would have been amazing, but repeatedly?!
(I've been diligent about using my employer's full gift match every year in part because I feel ethically obligated to make my ridiculously good fortune something that isn't purely to my own benefit.)