Tesla Vision fails as owners complain of Model 3 cameras fogging up in cold weather::A number of Tesla owners have taken to Reddit after their front cameras fogged up and stopped working in cold weather, leaving several features, including the US$10,000 FSD Beta, inoperable. Tesla has declined to assist these customers, despite many of their vehicles being covered under warranty.

  • /home/pineapplelover@lemm.ee · +10 / −50 · 1 year ago

    His argument makes sense. Human vision is not too different from just a camera. I see the argument for lidar but it can also be a bit more expensive to accomplish the same task. I’m open to listening to your argument as to why lidar technology would be a better path for self driving cars.

    • Flying Squid@lemmy.world · +21 · 1 year ago

      It seems to me that we want to make self-driving cars safer than human drivers. And to make them safer, you want them to use every kind of sensor that is practical to avoid accidents. LIDAR alone isn’t the path. Neither is vision alone.

      Also, suggesting that a car with cameras is equivalent to a human with a human brain that has eyes attached to it is a little silly.

    • nephs@lemmy.world · +22 / −2 · 1 year ago

      The eye is the whole fucking argument for stupid creationism. The most complex piece of machinery in the human body and shit.

      That man thinks he’s god, able to create similar functionality.

      Has he fucking tried to keep his eyes open in fucking cold weather?

      Why not just use human eyes outside of earth’s atmosphere?!

      He’s just so fucking stupid. Rich and stupid. The shit he spends his “hard earned” money on would be spent so much better and more efficiently by almost anyone else.

    • accideath@lemmy.world · +21 / −2 · 1 year ago

      Humans don’t just use eyes when driving. Sound and touch also play big roles, for example when it comes to hearing approaching ambulances or feeling road conditions. And we have a really good sense of depth and distance that’s much harder to replicate with just cameras. Even humans aren’t allowed to drive with headphones on (at least here in Germany), because it’s dangerous to limit the amount of sensory input available to us.

      Besides that, even our sight is faaaar from perfect, and there are quite a lot of accidents caused by drivers just not seeing another driver or some other obstacle. Our vision is pretty good, yes, but the amount of guessing our brain has to do for us to actually see what we do isn’t exactly small.

      I don’t know about you, but I would prefer a self driving vehicle to be safer than a human. Because if it isn’t, why bother? And how could it be safer, if it uses less information than humans, who are shit drivers already?

      And yes, lidar is more expensive, but so what? It’s cheap enough to add to phones. Expensive phones, yes, but in the grand scheme of things they’re still quite a bit cheaper than a car, and Teslas aren’t exactly cheap cars either. And Tesla used to include radar in their cars until they didn’t. And the cars didn’t get that much cheaper…

      And to give a positive example: Mercedes-Benz was the first to launch a Level 3 autonomous vehicle. And guess what? It uses lidar, audio sensors, road condition sensors, etc. and actually achieved L3 autonomy, while Tesla’s FSD consistently tests as one of the worst performing Level 2 systems in the industry, despite their claims of greatness…

      • Socsa@sh.itjust.works · +4 / −1 · 1 year ago

        Lidar is not just more expensive, it is also extremely fragile in a vehicle that is bouncing around at highway speeds.

        • accideath@lemmy.world · +2 / −3 · 1 year ago

          Well, it doesn’t seem to bother any other car manufacturer much. Probably because the benefits outweigh the complexity disadvantages.

    • aesthelete@lemmy.world · +17 / −3 · 1 year ago

      Human vision is not too different from just a camera.

      Oh yeah, human vision also causes people to mistake a blue truck for the sky and drive right into it. /s

        • aesthelete@lemmy.world · +1 / −4 · edited · 1 year ago

          Sure but usually because they weren’t looking or couldn’t see it…not because they mistook a truck for the sky or some of the other dumb shit computer vision algorithms do.

            • aesthelete@lemmy.world · +3 / −2 · edited · 1 year ago

              Not seeing something and mistaking something for another thing are pretty different problems. One can be corrected with glasses while correcting the other requires a brain transplant (or a brain in the first place).

              Edit: or, ya know, adding another sensor would work and make it so the vision system wouldn’t have to be so good at object recognition and could just not hit things… but we can’t add the couple hundred dollars’ worth of parts for that.

    • loutr@sh.itjust.works · +16 / −2 · 1 year ago

      The obvious argument is that eyes are far from perfect and fail us all the time, especially when going fast. We are quite good at making up for it, but saying “We have eyes so my self driving cars will have eyes too” is pretty fucking dumb.

      • ItsMeSpez@lemmy.world · +7 · 1 year ago

        We also recognized that we need to keep our windshields clear of fog in order for our eyes to work properly.

    • GoodEye8@lemm.ee · +13 · 1 year ago

      That argument doesn’t make sense because human vision isn’t that great either. When it’s dark or raining or snowing or foggy our vision is pretty shit.

      I’m not saying LIDAR is better, but rather pointing out that you actually want different types of sensors to accurately assess the traffic, because just one type of sensor isn’t likely to cut it. If you look at other manufacturers, they’re not using only LIDAR or only cameras. Some use LIDAR + camera, some use RADAR + camera, some use LIDAR, RADAR and camera. And I’m pretty sure that as manufacturers aim for higher SAE levels they will add even more sensors to their cars. It’s only Tesla who thinks they can somehow do more with less.
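      The “different types of sensors” point is really an argument about redundancy: if sensors fail independently, the odds that every one of them misses the same obstacle shrink multiplicatively. A toy sketch of that arithmetic (the miss rates are invented for illustration, not measured values):

```python
# Toy illustration of sensor redundancy, assuming independent failures.
# The miss rates below are made-up numbers, purely for the example.

def combined_miss_rate(miss_rates):
    """Probability that *every* sensor misses an obstacle,
    assuming the sensors fail independently of each other."""
    p = 1.0
    for m in miss_rates:
        p *= m
    return p

camera_only = combined_miss_rate([0.05])            # e.g. fog, glare
fused = combined_miss_rate([0.05, 0.08, 0.03])      # camera + radar + lidar

print(f"camera only:        {camera_only:.4f}")     # 0.0500
print(f"camera+radar+lidar: {fused:.6f}")           # 0.000120
```

      Under these assumptions, adding two imperfect sensors to an imperfect camera cuts the combined miss rate by orders of magnitude, which is the whole case for fusion over any single sensor.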

      • /home/pineapplelover@lemm.ee · +1 · 1 year ago

        I think it’s undeniable that the combination of camera and lidar will be the best solution. I just hope it can be cost effective. Maybe over time we can adapt and improve the technology and make it more economical, so that it is safer for our roads.

      • Socsa@sh.itjust.works · +2 / −2 · 1 year ago

        People here have no idea what they are talking about, or how absurdly difficult it is to actually deploy lidar to a consumer vehicle. There’s a reason why Tesla is shipping more advanced driver assist tech than anyone else, and it’s because they went against the venture capitalist Lidar obsession which is holding everyone back. There’s a reason why there are basically zero cars shipping with lidar today.

        You don’t need mm depth maps to do self driving. Not that you get that from lidar on rough roads anyway.

        • /home/pineapplelover@lemm.ee · +2 · 1 year ago

          There are some test cars with lidar. It has the spinny thing on top and looks pretty interesting. I believe those cars are pretty successful. I don’t think they’re being mass produced though, because the costs might be a little prohibitive.

        • learningduck@programming.dev · +2 / −1 · 1 year ago

          The most advanced, and it’s not even at autonomy level 3. It’s funny that Mercedes is the first to get level 3 approval in California and they don’t even boast about it that much.

          That aside, a secondary sensor that helps verify whether the vision system got it right would be nice. It could be just a radar or whatever. Imagine the vision system failing to recognize a boy in a Halloween costume as a person; at least the secondary sensor will make the car stop due to the contradicting perception.

          • GoodEye8@lemm.ee · +1 · 1 year ago

            I might be misremembering, but I think Teslas are actually more capable; they’re just deliberately stating they’re SAE level 2 so they can skirt the law and play loose and dangerous with their public beta test.

            • learningduck@programming.dev · +1 · 1 year ago

              I haven’t researched this enough, but Tesla saying they’re level 3 while never bothering to get the actual approval is like how I kept saying I was smart, but too lazy, back in my school years.

              Put your money where your mouth is. Lives are at stake here.

    • learningduck@programming.dev · +10 · edited · 1 year ago

      Think of the Wile E. Coyote and Road Runner cartoons. If there’s graffiti that looks like a tunnel, the coyote may run into it based on vision alone, but a secondary sensor would tell him there’s a wall.

      In real life, if the vision system fails to recognize that there’s something on the road, at least a secondary sensor will protest that there’s something there.
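      The “protest” described above is essentially a conservative veto rule: brake when any sensor reports an obstacle, rather than requiring all sensors to agree. A minimal sketch with hypothetical readings (not any real automotive API):

```python
from dataclasses import dataclass

@dataclass
class Reading:
    sensor: str
    obstacle_ahead: bool  # did this sensor detect something in the path?

def should_brake(readings):
    """Conservative fusion: one sensor reporting an obstacle is enough
    to brake, even if every other sensor disagrees."""
    return any(r.obstacle_ahead for r in readings)

# Painted-tunnel scenario: the camera sees "open road", the radar sees a wall.
readings = [Reading("camera", False), Reading("radar", True)]
print(should_brake(readings))  # True
```

      The trade-off of this rule is false positives (phantom braking when one sensor misfires), which is why real systems weigh sensor confidence instead of a bare veto, but it captures why a dissenting second sensor is valuable.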

      • HERRAX@sopuli.xyz · +7 / −1 · edited · 1 year ago

        You can also try driving in direct sunlight without sunglasses or the suncover. You get notifications and beeping noises whenever the sun hits the cameras directly, making the lane assist (I refuse to call it autopilot) quite unreliable in most weather… It’s actually worse for me than driving in cold weather.

    • poopkins@lemmy.world · +1 / −10 · 1 year ago

      While I disagree with you that his argument makes sense, I’m upvoting your comment because it encourages discourse and provides more insight and depth on this topic. I wish more people on Lemmy did the same.