I don’t mean BETTER. That’s a different conversation. I mean cooler.

An old CRT display was literally a small-scale particle accelerator, firing angry electron beams at a sizeable fraction of light speed towards the viewer, bent by electromagnets alternating tens of thousands of times per second, and stopped by a rounded rectangle of glowing phosphors.
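
For a sense of the frequencies involved, here’s a back-of-the-envelope sketch (Python, assuming a standard 640×480 @ 60 Hz VGA raster — my assumption, not anything specific to a particular tube):

    # Rough deflection rates for a 640x480 @ 60 Hz CRT raster (assumed mode).
    refresh_rate_hz = 60   # full frames per second = vertical deflection rate
    total_lines = 525      # 480 visible lines plus blanking, per the VGA timing convention

    # The horizontal deflection coil sweeps the beam across the tube once per
    # scanline, so it has to oscillate at:
    horizontal_rate_hz = refresh_rate_hz * total_lines

    print(f"Vertical sweep:   {refresh_rate_hz} Hz")
    print(f"Horizontal sweep: {horizontal_rate_hz / 1000:.1f} kHz")   # ~31.5 kHz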

If a CRT goes bad it can actually make people sick.

That’s just. Conceptually a lot COOLER than a modern LED panel, which really is just a bajillion very tiny lightbulbs.

  • ikidd@lemmy.world

    The internet?

    Web 1.0 and even before was way cooler than this corpo bullshit web we have now.

  • WoodScientist@lemmy.world

    Any mechanical regulation process that used to be handled by actual machine parts. Think of the centrifugal governor, this beautiful and elegant mechanical device just for regulating the speed of a steam engine. Sure, a computer chip could do it a lot better today, and we’re not even building steam engines quite like those anymore. But still, mechanically controlled things are just genuinely a lot cooler.

    Or hell, even for computing, take a look at the elaborate mechanical computers that were used to calculate firing solutions on old battleships. Again, silicon computers perform objectively better in nearly every way, but there’s something undeniably cool about solving a set of equations on an elaborate arrangement of clockwork.

    • Arghblarg@lemmy.ca

      Someone showed me a record turntable with what must have been a centrifugal governor! What an ingenious device. (I got the impression from him this was unusual for a turntable, at least…)

      • Count Regal Inkwell@pawb.socialOP

        I was under the impression that all wind-up turntables (i.e., from the era of shellac records, steel needles, and mechanical reproducers) used mechanical governors.

        Maybe I’m wrong though.

    • traches@sh.itjust.works

      H-model C-130s, the ones with the four square-bladed props? The engines and props are mechanically governed. There are electronic corrections applied, but the core of each system is purely mechanical. Still flying.

      Source: former flight engineer on them.

    • dmention7@lemm.ee

      To add, there is something about those old 1940s- and 50s-era technical films like the one you linked that is just so… I don’t know what exactly it is, but I find them fascinating and genuinely informative, even though they explain tech that is decades obsolete.

      It’s pretty awesome that they are still available 70+ years later in excellent quality!

    • Fondots@lemmy.world

      Centrifugal governors are possibly one of the origins of the phrase “balls out” or “balls to the wall” (although many say “balls to the wall” has to do with the ball-shaped handles on old aircraft throttle levers)

      Also somewhat similar to governors are centrifugal switches, which are used in just about anything with a single-phase electric motor to disconnect the start capacitor that gives the motor a little extra juice to get it going, once it’s up to speed (I like this video for an explanation of how they work)

      • WoodScientist@lemmy.world

        I didn’t know that was a thing. Thanks! I’m honestly surprised some MBA bean counter hasn’t replaced those with a chip of some sort by now. Really cool!

      • Nibodhika@lemmy.world

        He’s not talking about punch card programming; that’s way more advanced and requires a Turing-complete machine. What he’s talking about is computers as the term was used before what you’d think of as a computer existed.

        The example in the video is the computer for a cannon on a battleship. Without a computer you would need to set the cannon’s bearing and elevation directly, but those aren’t quantities a human can know; what a human can know is the bearing to the target ship and the distance to it. So instead you provide two inputs where a human dials those in, and the mechanism translates them into bearing/elevation. Those two alone would be fairly straightforward, essentially just relabelling the elevation crank as a distance crank. But this computer is a lot more complex, because wind, speed, etc. affect the shot, so you have cranks for all of that too, and internally they all combine into a final bearing/elevation output for the cannon.
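
        As a toy illustration of just the range-to-elevation step (vacuum ballistics in Python, no wind, drag, or target motion — the function and the numbers are mine, not anything from the video, and nothing like the real fire-control computers):

          import math

          def firing_solution(target_range_m, target_bearing_deg,
                              muzzle_velocity_ms=800.0, g=9.81):
              """Toy no-drag firing solution: the kind of range -> elevation
              conversion the mechanical cams encoded, vastly simplified."""
              # Vacuum ballistics: range = v^2 * sin(2 * elevation) / g
              x = g * target_range_m / muzzle_velocity_ms**2
              if x > 1.0:
                  raise ValueError("target out of range for this muzzle velocity")
              elevation_deg = math.degrees(0.5 * math.asin(x))
              # With nothing else modelled, train the gun straight at the bearing.
              return target_bearing_deg, elevation_deg

          bearing, elevation = firing_solution(target_range_m=15_000, target_bearing_deg=42.0)
          print(f"Train to {bearing:.1f} deg, elevate to {elevation:.2f} deg")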

  • over_clox@lemmy.world

    Interchangeable automotive/bicycle parts.

    Or for that matter, interchangeable anything parts.

    Both cooler and better at the same time. Interchangeable parts made it easier to both customize and repair your own stuff…

    • Albbi@lemmy.ca

      I love that Replaceable Parts is a technology you can research in Civilization. The first time I saw it I thought it was kinda stupid, until I thought “Oh wait, does that mean there was a time when replacement parts just weren’t a thing?”

      • over_clox@lemmy.world

        Used to be where Mongoose, Huffy, Schwinn, etc. bearings and stuff were interchangeable. Used to be where NVidia GPUs could run in an AMD motherboard. I happen to own older things on both ends of that compatible spectrum.

        Used to be where an Idle Air Control Valve from a Chevy would fit an Isuzu…

        • Davel23@fedia.io

          Used to be where NVidia GPUs could run in an AMD motherboard.

          They still can.

          • over_clox@lemmy.world

            Oof, wait. I mean when AMD processors were actually compatible with nVidia motherboards.

            A8N-SLI Deluxe

            • autriyo@feddit.org

              But that’s not a thing for Intel CPUs either, at least not anymore.

              I’m not sure why, but Nvidia hasn’t been making chipsets/motherboards for quite a while. Or was there a point in time when it only made chipsets for Intel CPUs?

        • Nibodhika@lemmy.world

          Used to be where NVidia GPUs could run in an AMD motherboard. I happen to own older things on both ends of that compatible spectrum.

          I don’t know what you mean by that. The protocol for communication of computer parts is open source. Desktop computers are a great example of interchangeable parts. An Nvidia GPU that can’t run in an AMD motherboard is either not from the same era (so an equivalent AMD GPU wouldn’t work either) or a different form factor (e.g. trying to plug a laptop GPU into a desktop).

          • over_clox@lemmy.world

            The protocol of communication of computer parts is open source? Since when?

            What the fuck is USB? And why is that proprietary?

            Regardless, AMD vs nVidia might work together, but not optimally these days.

            • Nibodhika@lemmy.world

              The protocol of communication of computer parts is open source? Since when?

              Since forever; which protocol do you think isn’t? For a couple of examples, here are PCI and DDR5.

              What the fuck is USB? And why is that proprietary?

              USB is a standardized connector, with, again, an open source protocol. Here’s the 2.0 specification in case you’re interested: https://www.usb.org/document-library/usb-20-specification
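
              A small sketch of what that standardization buys in practice: with a generic library (pyusb here, which is my choice and needs a libusb backend installed — nothing from the spec itself) you can enumerate every vendor’s devices the same way, because they all answer the same standard descriptor requests:

                import usb.core  # pyusb; needs a libusb backend installed

                # One generic loop identifies devices from any manufacturer,
                # because the descriptor format is part of the USB standard.
                for dev in usb.core.find(find_all=True):
                    print(f"ID {dev.idVendor:04x}:{dev.idProduct:04x} "
                          f"(bus {dev.bus}, address {dev.address})")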

              Regardless, AMD vs nVidia might work together, but not optimally these days.

              I would need a source for that; I’ve had AMD + Nvidia up until very recently and it worked as expected.

              • over_clox@lemmy.world

                USB is absolutely not a standardized connector, otherwise it would only be one type of connector, not the dozen or so they’ve made over the decades. There’s nothing universal about it.

                And if it was open source, then why doesn’t VirtualBox release the source code for their USB extension package?

                • Nibodhika@lemmy.world

                  USB is absolutely not a standardized connector,

                  USB is absolutely standardized; I even sent you the 2.0 spec, and you can get the specs for the other versions on the same website.

                  otherwise it would only be one type of connector, not the dozen or so they’ve made over the decades.

                  Different versions/connectors have different specs, all of them open; otherwise different manufacturers wouldn’t be able to create devices that use them.

                  There’s nothing universal about it.

                  That’s ridiculous. First of all, the name relates to the fact that it can be used for any data transfer as long as it’s serial. Secondly, the sheer number of different devices from different manufacturers that can be plugged in via USB should give you a hint of just how universal and open the standard is.

                  And if it was open source, then why doesn’t VirtualBox release the source code for their USB extension package?

                  The standard is open; implementations of it don’t have to be. It’s like OpenGL or Vulkan.

            • azuth@sh.itjust.works

              Regardless, AMD vs nVidia might work together, but not optimally these days.

              And yet for most of the past two years the best choice for a gaming PC has been a 3D V-Cache Ryzen with an Nvidia GPU. Is there something particular you have in mind that supposedly doesn’t work with an AMD chipset and an Nvidia GPU?

              PCI Express is not an open standard, but both AMD and Nvidia are members of the PCI-SIG, and it’s what both use for their GPUs and what AMD (as well as Intel) uses for its chipsets. It’s certainly not a secret cabal.

      • CommissarVulpin@lemmy.world

          The concept of having interchangeable, standardized parts is actually kind of a new idea from the Industrial Revolution. Before then, everything was custom-made to fit. The example that comes to mind is firearms. All of the muskets and rifles used in the Revolutionary War, for example, were hand-made and hand-fitted. The lock from one rifle wouldn’t necessarily fit on another. If your stock broke, you couldn’t just go get a new stock and slap it on - you had to bust out the woodworking tools and make a new one.

  • terraborra@lemmy.nz

    Railway signalling and interlocking systems. Sure, ETCS and other digital systems are far safer, but some dude at a junction used to manually set the points and crossovers using a giant lever. Now everything’s just a digital system overseen by someone with 8+ monitors in a control room removed from the actual network.

    Also, not a technology, but rally cars used to be fully unhinged. I could watch old Group B videos for hours and never get bored.

  • Pyflixia@kbin.melroy.org

    Cell phones, when they had personality. The 2000s were such a good time for them; you had so many designs. Slide-out keyboards, sliding panels, sleek designs, some with actual buttons, etc.

    But we’re now relegated to just a varying series of rectangles and squares. Yay…

    • AwkwardLookMonkeyPuppet@lemmy.world

      I got one that slid up to reveal the keyboard after watching The Matrix, and I thought it was the coolest phone ever. I still have it, and I still think it is pretty cool.

      Edit: it’s not an actual keyboard though, it’s a phone keypad for dialing or sending texts with T9 input.

      It was the Samsung A737.

      It looks like this closed

      And here it is open

      • morbidcactus@lemmy.ca

        I still have my Sony Ericsson W580i, also thought it was the coolest thing.

        Still works and holds a charge; I pulled photos off of the memory card recently, and cameras have definitely gotten a lot better… Had the red one. I’ve had some very brightly colored phones over the years, my favourite being the bright yellow Nokia Lumia 1020.

        Had an amazing camera on it.

        • apprehensively_human@lemmy.ca

          I think my favourite thing about those old Lumias was the dedicated camera shutter button. The 1020 even had a battery grip case so you could hold it more like a digital camera.

        • AwkwardLookMonkeyPuppet@lemmy.world

          Those are cool phones too, especially the red one! Yeah, the old digital cameras used to be junk. I have some old digital photos that look like they were taken on a potato.

  • Brkdncr@lemmy.world

    There was a virus back in the day that could take advantage of old monitors. It would move a turkey around the screen and if you looked at it too long it would cause eye damage.

  • Mac@mander.xyz

    Carburetors are pretty fuckin cool.
    The concept seems simple: use the vacuum from the engine to pull in fuel. But they’re extremely complicated, with all the tiny orifices and passageways needed to meter just the right amount of fuel into the engine at different operating points.
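
    A toy version of the core idea, just to show where the suction that draws fuel through the jet comes from (incompressible Bernoulli plus continuity in Python; the numbers are invented for illustration, not taken from any real carburetor):

      RHO_AIR = 1.2  # kg/m^3, roughly

      def venturi_pressure_drop(inlet_velocity_ms, inlet_diameter_mm, throat_diameter_mm):
          # Continuity: the same air squeezed through a narrower throat moves faster.
          area_ratio = (inlet_diameter_mm / throat_diameter_mm) ** 2
          throat_velocity = inlet_velocity_ms * area_ratio
          # Bernoulli: faster air at the throat means lower static pressure there,
          # and that pressure difference pushes fuel up through the jet.
          return 0.5 * RHO_AIR * (throat_velocity**2 - inlet_velocity_ms**2)  # pascals

      dp = venturi_pressure_drop(inlet_velocity_ms=20, inlet_diameter_mm=40, throat_diameter_mm=28)
      print(f"Throat runs about {dp:.0f} Pa below inlet pressure")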

    Unrelated sidenote: I got déjà vu writing this comment. Interesting.

  • Paradachshund@lemmy.today

    A lot of older tech had a way more interesting silhouette. You can see this clearly in how many objects live on in icon form. We still often use handset phones, magnifying glasses, gears, or the infamous floppy disk save icon. I think the staying power of these really comes from how ephemeral and formless digital tech can be.

    • ElectricMachman@lemmy.sdf.org

      Reminds me of the device icons in Microsoft Intune.

      Android is represented by a rounded rectangle. Linux is a rounded rectangle, but yellow. iOS is, you guessed it, a rounded rectangle, but black.

      Windows, however, is a nice flat-screen monitor (read: a rounded rectangle on a stick)

  • tal@lemmy.today

    I like the look of vacuum-fluorescent displays (VFDs) – high contrast, with a black background and solid areas of color, and enough brightness to cause some haloing spilling over into the blackness when you looked at one. They led to a particular design style adapted to the technology, one that was very “high-tech” in maybe the 1980s.

    OLEDs have high contrast, and I suppose you could probably replicate the look, but I doubt that the style will come back.

    https://en.wikipedia.org/wiki/Vacuum_fluorescent_display

    • Domi@lemmy.secnd.me

      Many receivers and amplifiers still have VFDs to this day. I still wonder why; LCDs have to be significantly cheaper.

      They look cool as hell though, so I appreciate that they go the extra step.

      • toddestan@lemm.ee

        As someone who also likes VFDs, I’ve fully expected that they’d be extinct in new products by now thanks to cheap LCDs and OLED. But I find it awesome that they’re still hanging in there.

    • ggtdbz@lemmy.dbzer0.com

      Newer, but I quite like the gentle amber LCD (not LED) displays of my car. At night it’s bright enough and sharp enough without being visually loud. I wish more of these displays were still being made, I’d love to use them in car-centric Arduino projects and data displays that would be consulted at night or that sort of thing.

      I always ask my friends “How the fuck do you live like this?” when I hop into a car and the music UI is a garish color searing itself into my retinas permanently.

      Thankfully, advertising companies have identified this marginal comfort I find in the warm interior lighting of my car and have proceeded to mount insultingly blinding screens all over the city.

      The city being the midrise urban sprawl north of Beirut. What do you mean regulations on brightness habibi? You think you live in Paris? Imagine this: half the street is unlit because the power is out, but the advertising company’s invasive bullshit budget™ has enough foreign cash to burn to keep generators running all night for these shitty ads. Gotta beam an extra few kilowatts of photons straight into this sleepy driver’s eyeballs while they operate a motor vehicle, on a highway that a lot of people cross by foot. There’s a special on fish at the fancy supermarket, how will I live without that knowledge?


      Thankfully, the “state” of Israel has identified that the civilian structures of Lebanon mildly inconvenienced me, and has proceeded to

      • tal@lemmy.today

        Newer, but I quite like the gentle amber LCD (not LED) displays of my car. At night it’s bright enough and sharp enough without being visually loud. I wish more of these displays were still being made, I’d love to use them in car-centric Arduino projects and data displays that would be consulted at night or that sort of thing.

        Not sure if you mean VFDs or amber LCDs, but Matrix Orbital sells both sorts in small quantities that you’d use in a project and can interface to a microcontroller – I was interested in them myself when looking for small VFDs, years back. They’re going to be seven-segment or grid displays, though, not things with physical custom display elements like those car dash things, but that’s kinda part and parcel of small-run stuff.

        https://www.matrixorbital.com/

        https://www.matrixorbital.com/blc2021

        Just choose the “amber” option if it’s an amber LCD you want.

        Can also get their displays via Mouser or Digikey.

        • ggtdbz@lemmy.dbzer0.com

          That’s exactly the kind of display I’m talking about. Nice to see they’re still around.

          The ones I have are all just grids, higher resolution than these but still comfortingly blocky. I’ve actually replaced the dash display recently since the original one got deep fried under the sun and lost all contrast when the weather was above 20°C.

          • tal@lemmy.today

            Ah, good to hear it. They do (or did, and I assume still do) also have higher res displays.

            Going back to an earlier bit in the conversation, where you were concerned about light sources in the car, I think that auto-dimming might also help (not just with VFDs, but with the brightness of any in-car display). My car dash has the option to automatically set brightness based on ambient light levels (something that I wish my desktop computer monitor could do… part of “dark mode”’s benefit is that it mitigates this on devices that don’t). I don’t know if that was a thing back in the 1980s or so, when these display designs were popular.
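
            Something like this curve, roughly (a sketch only – the thresholds and the log-scale interpolation are my own guesses, not how any particular car actually does it):

              import math

              def display_brightness(ambient_lux, night_lux=10, day_lux=10_000,
                                     min_brightness=0.05, max_brightness=1.0):
                  """Map an ambient-light reading to a display brightness, clamped so
                  the display never goes fully dark while you're driving."""
                  if ambient_lux <= night_lux:
                      return min_brightness
                  if ambient_lux >= day_lux:
                      return max_brightness
                  # Interpolate on a log scale, since perceived brightness is roughly logarithmic.
                  t = ((math.log10(ambient_lux) - math.log10(night_lux))
                       / (math.log10(day_lux) - math.log10(night_lux)))
                  return min_brightness + t * (max_brightness - min_brightness)

              for lux in (1, 50, 1_000, 20_000):
                  print(lux, round(display_brightness(lux), 2))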

            I also kind of wonder if eye-tracking, which has come a long way, could be made reliable and responsive enough to switch displays off when the car detects that the driver is looking away from them. Maybe be conservative and skip the critical displays, but do it for stuff like the radio or clock. Eye-tracking systems normally work in the near-infrared, as I understand it, not visible light, so I’d think you could theoretically do it in a darkened car without problems.

  • rouxdoo@lemmy.world

    I’m. sorry for your thread but in actual fact there is no older tech that is cooler than any modern tech - we get better tech because we get better at the same tech as time goes by. Sorry, but your premise is flawed.

      • wagesj45@fedia.io

        I’m sorry, but your premise is flawed. The actual fact is that they are not fun at parties.

    • Count Regal Inkwell@pawb.socialOP

      And I am sorry that you apparently don’t know how to read

      Cooler and better are different concepts, and literally everyone else got the idea.

    • Noel_Skum@sh.itjust.works

      A wall full of 12” vinyl versus a hard drive full of FLAC files? We’re talking cooler - not better. Relax a bit, read the question again.

    • stoy@lemmy.zip

      Nope, your premise is flawed.

      You forget that “cool” does not equal “better”; cool is a vibe, better is a fact.

      Minidisc players are cool and futuristic; an iPhone is a better music player, but not as cool.

      The opening scene of The Matrix would not have been as cool if Neo had just handed the guy a USB stick rather than a Minidisc.

    • 10_0@lemmy.world

      I’m. sorry you couldn’t contribute to this conversation constructively but in fact there is nothing you could have added anyway - other people post better comments than you every time a minute goes by. Sorry, but your comments premise is flawed.

    • tal@lemmy.today

      That’s true in the sense that, absent very unusual cases, we don’t lose technology, so all the past technology remains. I think that’s a valid insight.

      However, I think that it’s also true to say that there are technologies that – while not gone – fall into disuse because of a changing environment.

      You’re saying that a “better” technology will remain, and for certain definitions of “better”, I agree. Absent maybe a changing environment that makes what is “best” different at different points in time, or a changing understanding of what is “best” (e.g. internal combustion vehicles going away as we understand the impact of carbon dioxide emissions), we have no reason to stop using a better technology.

      But OP is specific in distinguishing between “best” and “coolest”:

      I don’t mean BETTER. That’s a different conversation. I mean cooler.

      So I think that his question is valid.

  • deegeese@sopuli.xyz

    Older forms of computer RAM.

    Before integrated circuits, we had core memory: a grid of wires where each intersection held a little magnetic donut storing a single 1 or 0.

    https://en.wikipedia.org/wiki/Magnetic-core_memory
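
    Here’s a toy software model of the core-memory addressing trick (coincident-current selection: only the core that gets half-current on both its X line and its Y line flips). This is illustrative Python, not a faithful electrical model:

      class CoreMemory:
          def __init__(self, size=8):
              # One magnetisation state per donut, at each X/Y wire crossing.
              self.cores = [[0] * size for _ in range(size)]

          def write(self, x, y, bit):
              # Drive the selected X and Y lines with half-current each: only the
              # core at (x, y) sees enough combined field to switch.
              self.cores[y][x] = bit

          def read(self, x, y):
              # Reads were destructive: force the core to 0, sense whether it
              # flipped (a flip induces a pulse on the sense wire), then write back.
              bit = self.cores[y][x]
              self.cores[y][x] = 0
              self.write(x, y, bit)
              return bit

      mem = CoreMemory()
      mem.write(3, 5, 1)
      print(mem.read(3, 5))   # -> 1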

    Before that they had delay-line memory, where bits were stored as vibrations traveling down a long tube of mercury, and more bits meant a longer tube to hold a longer wave train.

    https://en.wikipedia.org/wiki/Delay-line_memory

    • grue@lemmy.world

      Even though the story involves drum memory instead, your mention of delay-lines reminds me of The Story of Mel, a Real Programmer. Y’all should read the whole thing (it’s not long), but here’s a quick excerpt:

       Mel's job was to re-write
       the blackjack program for the RPC-4000.
       (Port?  What does that mean?)
       The new computer had a one-plus-one
       addressing scheme,
       in which each machine instruction,
       in addition to the operation code
       and the address of the needed operand,
       had a second address that indicated where, on the revolving drum,
       the next instruction was located.
      
       In modern parlance,
       every single instruction was followed by a GO TO!
       Put *that* in Pascal's pipe and smoke it.
      
       Mel loved the RPC-4000
       because he could optimize his code:
       that is, locate instructions on the drum
       so that just as one finished its job,
       the next would be just arriving at the "read head"
       and available for immediate execution.
       There was a program to do that job,
       an "optimizing assembler",
       but Mel refused to use it.