SendOfJive
Hehe and if ya think D&B was primitive, during my senior year in high school there was exactly one computer in the entire building. Our IBM 1620 was a shade bigger than a large rolltop desk, with a front panel decked out in matrices of small incandescent lamps (cheap LEDs weren't available then). Check that Wiki page and see why some peeps called it the CADET. Now for the good stuff:
No hard drive. When you turned it off it forgot everything. After turning it on in the morning it took maybe 15 minutes to warm up. That time could be shortened if you knew a neat little trick. Open the panel on the right side of the machine to expose a large circuit board with relays protected by a plexiglass cover. Press the plexiglass down over a certain relay to close its contacts, and voila! Time to compute.
But how?
Input devices: card reader, typewriter keyboard, four toggle switches that the software could interrogate.
Hardcopy output devices: card punch, typewriter.
Video device: continuous-form paper printed upon by the typewriter.
RAM: 20K half-bytes (10K bytes) of magnetic core memory. Yes, K as in 1,000.
No mouse. No sound. No networking. Fat power cable to drive relays, discrete transistors, etc.
A math teacher who'd learned FORTRAN and taught my class.
LOL. All you could do was run FORTRAN programs (no debugger), assembly programs, and machine-language programs, period. Punch up a deck of 80-column cards with your FORTRAN program on a Model 29 keypunch. Be careful, though, because if you made a mistake you couldn't "un-punch" it; you had to toss the card and try again. Don't drop the deck, either, because re-ordering it is a pain. Reset the machine and load the compiler - a deck of cards about three inches thick. Plop the FORTRAN deck in the card reader and pray the compiler likes it. If not, fix the bad card(s) and retry. Then your program runs. If there's a run-time error the machine will gladly tell you by printing the word ERROR followed by a number on the console. Then you had to look the number up in a book to see what it meant. ERROR 63 meant floating-point overflow; 10^46 was about as high as the compiler would accommodate. Normally the compiler would accept multiple programs, but sometimes it got confused and had to be reloaded from scratch.
I took to this baby like a fish takes to water and was ultimately writing programs in machine language. Each instruction was 12 digits long: 2 digits for the opcode, 5 for the first operand and another 5 for the second. Kind of a pain because if you wanted to insert an instruction you had to go back and fix every other instruction that branched beyond the new one.
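That 12-digit layout (a 2-digit opcode followed by two 5-digit operand addresses) is simple enough to sketch in a few lines of modern Python. This is just an illustration of the field split described above; the example instruction and its addresses are made up, and no attempt is made to map opcodes to the 1620's actual mnemonics:

```python
def decode(instruction: str):
    """Split a 1620-style 12-digit instruction into its three fields:
    2-digit opcode, 5-digit first operand address, 5-digit second."""
    if len(instruction) != 12 or not instruction.isdigit():
        raise ValueError("expected exactly 12 decimal digits")
    opcode = instruction[0:2]
    p = instruction[2:7]    # first operand address
    q = instruction[7:12]   # second operand address
    return opcode, p, q

# A made-up instruction: opcode 21, operand addresses 00510 and 00994
print(decode("210051000994"))  # → ('21', '00510', '00994')
```

You can also see from this why inserting an instruction was such a pain: the P and Q fields hold absolute addresses, so anything that pointed past the insertion point had to be patched by hand.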
The amount of computing power IBM developers squeezed out of that machine was simply amazing. But at the time IBM was rolling cash and could afford to hire the very best talent.
After high school I studied for 4 years to earn a B.S. Our school had a Xerox mainframe, and that's where I learned how to program its operating system, which made me a shoo-in at D&B. Very few people code in assembly language anymore. Good C compilers showed up during the 1980s, making it feasible to develop an OS in a sorta hi-level language. C++ was a pig in a tutu; I never learned it and went straight from C to Java when it came out.
One can only wonder how programmers got anything out of ENIAC. No transistors, just a boatload of vacuum tubes, a mammoth power sink, a ventilation problem, and a repairman's nightmare!