I saw this article, which made me think about it…

Kids under 16 to be banned from social media after Senate passes world-first laws


Seeing what kind of brainrot kids are watching makes me think it’s a good idea. I wouldn’t say all content is bad, but most kids will get hooked on trash content that is intentionally designed to grab their attention.

What would be an effective way to enforce a restriction with the fewest possible side effects? And who should be the one enforcing that restriction in your opinion?

  • orcrist@lemm.ee · 27 days ago

    It doesn’t matter what you think. Kids will do what they want to do and that’s that, so everything else is a question of how much time, money, and posturing people want to put into it.

    • RememberTheApollo_@lemmy.world · 27 days ago

      What should it be called? Agreed that the vast majority of it is a dumpster fire of lies and brain rot, but what would you rename it?

  • Dave@lemmy.nz · 28 days ago

    I can’t remember which article I was reading, probably one on Lemmy, but it said that we know social media algorithms are bad for people and their mental and physical health, that they are divisive, drive extremism, and just in general are not safe for society.

    Drugs are regulated to ensure they are safe, so why aren’t social media algorithms regulated the same way? Politicians not understanding the technical details of algorithms is not an excuse - politicians also don’t understand the technical details of drugs, so they have a process involving experts that ensures they are safe.

    I think I’m on the side of that article. Social media algorithms are demonstrably unsafe in a range of ways, and it’s not just for under 16s. So I think we should be regulating the algorithms, requiring companies wishing to use them to prove they are safe before they do so. You could pre-approve certain basic ones (rank by date, rank by upvotes minus downvotes with time decay like Lemmy, etc.). You could issue patents to them like we do with drugs. But all in all, I think I am on the side of fixing the problem rather than pretending to care in the name of saving the kids.
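
    Just to show how simple the “basic” pre-approvable ones are, here is a rough sketch of the two I mentioned; the +2 offset and the gravity exponent are made-up numbers for illustration, not Lemmy’s actual formula:

    ```python
    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass
    class Post:
        title: str
        upvotes: int
        downvotes: int
        published: datetime  # timezone-aware

    def rank_by_date(posts):
        # Pure chronological feed: no engagement signal at all.
        return sorted(posts, key=lambda p: p.published, reverse=True)

    def rank_by_score_with_decay(posts, gravity=1.8, now=None):
        # Upvotes minus downvotes, divided down as the post ages.
        now = now or datetime.now(timezone.utc)
        def score(p):
            age_hours = (now - p.published).total_seconds() / 3600
            return (p.upvotes - p.downvotes) / (age_hours + 2) ** gravity
        return sorted(posts, key=score, reverse=True)
    ```

    Anything more opaque than that - engagement prediction, watch-time optimisation, and so on - is where I would want an approval process to kick in.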

    • orcrist@lemm.ee · 27 days ago

      I recall that some years ago Facebook was looking into their algorithm and found that it was potentially leading to overuse, which might be what you’re thinking of. But what actually happened is that they changed it so that people wouldn’t be using Facebook as much. Of course, people who are opposed to social media ignored the second half of the above statement.

      Anyway, when you say the algorithms are demonstrably unsafe, you know you’re wrong because you didn’t demonstrate anything, and you didn’t cite anyone demonstrating anything. You can say you think they’re unsafe, but that’s a matter of opinion and we all have our own opinions.

      • Dave@lemmy.nz · 27 days ago

        > I recall that some years ago Facebook was looking into their algorithm and found that it was potentially leading to overuse, which might be what you’re thinking of.

        No, it was recent, and it was an opinion-style piece, not news.

        > But what actually happened is that they changed it so that people wouldn’t be using Facebook as much.

        Can you back this up? Were they forced to by a court, or was this before the IPO, when Facebook was trying to gain ground and didn’t answer to the share market? I can’t imagine they would be allowed to take actions that reduce profits; companies are legally required to maximise value to shareholders.

        > Anyway, when you say the algorithms are demonstrably unsafe, you know you’re wrong because you didn’t demonstrate anything, and you didn’t cite anyone demonstrating anything. You can say you think they’re unsafe, but that’s a matter of opinion and we all have our own opinions.

        I mean, it doesn’t take long to find studies like “A nationwide study on time spent on social media and self-harm among adolescents”, “Does mindless scrolling hamper well-being?”, or “How Algorithms Promote Self-Radicalization”, but I think this misses the point.

        You’ve grabbed the part where I made a throwaway comment but missed the point of my post. Facebook is one type of social media, and they use a specific algorithm. Ibuprofen is a specific type of drug. Sometimes ibuprofen can be used in a way that is harmful, but largely it is considered safe. But the producers still had to prove it was safe.

        • orcrist@lemm.ee · 27 days ago

          Here’s one example of Facebook adjusting its algorithm several years ago. You can remark that it ought to do more, and I may agree with you, but that’s totally different from saying it doesn’t do anything positive. https://www.washingtonpost.com/technology/interactive/2021/how-facebook-algorithm-works/

          If your argument is that there can be drawbacks to using social media, I think everyone agrees. But remember, we were told horror stories about pinball, pool, comic books, chewing gum, Dungeons and Dragons, the list goes on and on. So with that in mind, I hope you can understand why I’m not convinced by a few studies that social media is net negative in value.

          And the reason we have laws requiring careful drug testing is because of damage that was done in the past, proven damage that actually happened, people whose lives were cut short because they were doing things like imbibing radioactive chemicals. Your suggestion that we ought to treat social media the same is putting the cart before the horse. The burden of proof is on you, not on social media companies.

          • Dave@lemmy.nz · 26 days ago

            I think we ultimately have different beliefs about how things should work. I think companies should prove their products are safe; you think things should be allowed unless someone can prove they’re not safe.

            I get it, and I think it’s OK to have different opinions on this.

  • shortwavesurfer@lemmy.zip · 28 days ago

    Absolutely not. Anything you put in is likely going to have privacy issues for both adults and children, and you forget how smart children are. I know we had firewalls and all kinds of shit when I was in school, and I was the person who taught everybody else how to bypass them in like five minutes. There is not a filter in the world you can put up that is going to keep children from the content they actually want to look at.

    • Im_old@lemmy.world · 28 days ago

      Have you ever heard of the great firewall of China? It’s always a budget issue, not a technical issue. We can block what we want with the right resources.

      • shortwavesurfer@lemmy.zip · 28 days ago

        I think the better question is who has not heard of the Great Firewall of China, but it can still be bypassed. In fact, I’ve even spoken on a podcast with somebody from China who was bypassing the firewall while we were talking.

  • transscribe7891@lemmy.dbzer0.com · 28 days ago

    I agree with the under-16 social media ban, but figuring out how it gets implemented, and by whom, is definitely going to be the hard part. Ideally it would be parents first, but that’s been the status quo up to now and it hasn’t worked. And as someone who has been “18+” online since I was 10… raising the age limit on the services themselves is only going to work to a certain extent. I’m very curious to see how this plays out.

  • givesomefucks@lemmy.world · 28 days ago

    I mean, you can’t really do it without parents.

    But there could be a law that any phone tied to a number a minor possesses is locked down so it can’t install the apps. It wouldn’t stop web-based use, but apps seem to be a worse problem for various reasons.

    It’s not even so much the content that’s the problem, it’s the delivery mechanism, how it affects dopamine release, and how damaging those changes can be to a developing brain.

    It’s similar to the loot box systems that were regulated in various countries. Human brains will keep trying the next item in their feed because there’s a chance something good shows up. If every post were good, it would actually cause less addiction.

    But a child has shit-tier impulse control. They’re going to keep pulling the proverbial lever forever, wading through shit for the slightest dopamine hit, all the while still being influenced by what they scroll past.

    • WoahWoah@lemmy.world · 28 days ago

      Yes, parents obviously still play an important role. But we regulate many things for people under the age of 18 to generally good effect.

  • NeoNachtwaechter@lemmy.world · 28 days ago

    How? Make it a crime for every manager of the social media companies to let a child in.

    > fewest possible side effects?

    No. That is not a goal when it is about child protection.

  • cRazi_man@lemm.ee · 28 days ago

    Why only kids? We all need to be protected from social media.

    I don’t know how to suggest good national policy, but I think social media needs these:

    • controls on how far you can doomscroll
    • the ability to opt out of algorithms: seeing things in time order, optionally only from whitelisted sources, with block lists available in a variety of ways, etc. (a rough sketch follows this list)
    • heavy moderation of blatantly illegal content
    • heavily curated advertising (or none at all; users can pay)
    • separation of political content (maybe a system of tagging so topics that are not of interest can be hidden; maybe this could be crowdsourced)
    • strict control of data collection
    • the ability to delete/be forgotten
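
    For the opt-out bullet above, a rough sketch of what I mean (the field names here are just for illustration, not any platform’s real API):

    ```python
    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Post:
        author: str
        source: str        # the community/channel the post came from
        published: datetime
        text: str

    def plain_feed(posts, whitelist=None, blocklist=frozenset()):
        # Newest-first feed: no engagement ranking, only the filters the user chose.
        # whitelist=None means "show every source"; otherwise only listed sources.
        visible = [
            p for p in posts
            if p.author not in blocklist
            and (whitelist is None or p.source in whitelist)
        ]
        return sorted(visible, key=lambda p: p.published, reverse=True)
    ```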

    I don’t know how propaganda or corporate interests can be excluded, but that would be ideal.

  • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net · 28 days ago

    I don’t think kids should be barred from social media, since at its core, social media is just people talking and sharing things with each other.

    The problem is not with the medium, or generally with who is using it; it’s with the rate of consumption, poor parenting, and poor moderation.

    I also think it is an even larger problem to enforce in the first place, since it will destroy one of the good things about the Internet: anonymity. The only way to truly enforce an age restriction is to require ID to verify a user’s identity. I’m not as hardcore about my privacy as some parts of Lemmy are, but this is one thing I absolutely do not want to see happen.

  • Lvxferre@mander.xyz · 28 days ago

    I don’t think that kids should be banned from social media. Instead they should be taught how to handle it in an individually and socially healthy way. Namely:

    • how to spot misinformation
    • how to spot manipulation
    • how to protect yourself online
    • how to engage constructively with other people
    • etc.

    This could be taught by parents, school, or even their own peers. But I think that all three should play a role.

    • Surp@lemmy.world · 27 days ago

      That’s something most children can’t understand. What you’re proposing basically adds up to an entire multi-year school course, and with the way the education system is going in many countries, I’d say good luck. It’s not as easy as saying, “Oh little Charlie, that’s fake info; you should read XYZ scientific papers on climate change.” Kids are fucking stupid even while going to school. People are constantly coming up with new ways to trick people, and kids are above all the easiest to trick.

      • Lvxferre@mander.xyz · 27 days ago

        > That’s something most children can’t understand.

        We’re talking about children and teens. A 6yo eating bullshit is natural; a 13yo doing it shouldn’t be. Please don’t be disingenuous; stop oversimplifying, which just distorts things.

        > What you’re proposing basically adds up to an entire multi-year school course

        Full stop here. That is not even remotely close to what I said. Stop lying.

        I’m not going to waste my time further with you.

        • Surp@lemmy.world · 27 days ago

          Teens are also stupid. It’s easy for you to simplify; you probably don’t have children, nor do you work around them. They don’t need social media so early. Spotted the Russian Facebook employee. You’re a waste of everyone’s time.

  • FlashMobOfOne@lemmy.world · 28 days ago

    Yes. I hope the rest of the world will begin addressing the issue.

    There’s a wealth of information linking negative mental health to social media use (hell, read stories about QAnon), and I look at regulating social media among kids the same way we regulate cigarette smoking. Will it be perfect? No. But that doesn’t mean it isn’t worth doing.

  • Cochise@lemmy.eco.br · 28 days ago

    We can’t regulate half a dozen corporations by prohibiting algorithmic feeds and targeted ads, so instead we will ban millions from using the apps with these features.

  • ERROR: Earth.exe has crashed@lemmy.dbzer0.com · 28 days ago

    I don’t think the government should be banning kids from using certain parts of the internet because of perceived harm. Kids need to explore the internet for themselves in order to learn first-hand about the dangers. I mean, the same argument for social media can be applied to everything on the internet. Search engine results are full of scams, so are we banning search engines too? News sites are full of misleading information, so let’s ban news sites? Then the only source of information is schools, which can also be biased and in some places just regurgitate government propaganda. In red states, schools are constantly telling LGBT+ kids that they are committing “sins” and that they are “mentally ill”, and those kids might have very conservative parents with no sympathy. Are we really gonna stop kids from going online and seeking support? If kids can’t even be allowed to explore the digital world, how are we still allowing them to explore the physical world, where there are physical dangers?

    In an ideal scenario, kids should be allowed to freely explore the internet, but should have parents that they trust to talk to in case they face any danger or harassment, so the parents can help them deal with it.

    Eventually, kids are gonna grow up, and a kid with zero online experience their entire life suddenly gaining free access to all of the internet is a recipe for disaster.

    It’s like not teaching kids sex ed: when they get old enough, they’ll end up having unprotected sex.

    Edit: Not to mention, a social media ban is not very enforceable. Even China, an authoritarian regime, is unable to stop kids from gaming; they just steal their grandparents’ IDs and play anyway. Do y’all really want a democratic country to suspend civil rights and start privacy intrusions?

  • freethemedia@lemmy.dbzer0.com · 28 days ago

    Controversial opinion:

    In the future we are going to look back on seeing children use iPads that directly connect them to the most sophisticated engagement and manipulation algorithms ever as something as horrid as a child smoking a cigarette, or doing any other drug

    Now obviously this is an issue, but many of the suggested solutions are lacking.

    Remember: the phones in our pockets are Turing complete; any software solution can be undone by another software solution.

    Hardware flaws baked into chipsets will be inevitably exploited by the worst of humanity

    What we need is a LEGAL framework for this issue.

    We need to see that letting a child get their hands on a full 5G, 2.5 GHz portal to the unknown without parental or educational supervision is simply, absolutely harmful.

    I suspect it really should work like regulating a drug: allow more and more unsupervised compute and networking as the child ages.

    That way kids can still have dumb phones for basic safety and comms.
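
    Purely as an illustration of that “dose by age” idea (every tier, age, and capability below is invented, not a real or proposed rule):

    ```python
    # Hypothetical age tiers: (minimum age, allowed capabilities).
    # None means "unrestricted". All values are invented for illustration.
    AGE_TIERS = [
        (0,  {"calls", "sms"}),                                      # dumb phone
        (13, {"calls", "sms", "supervised_browsing"}),               # supervised web
        (16, {"calls", "sms", "supervised_browsing", "messaging"}),  # still no feeds
        (18, None),                                                  # full access
    ]

    def allowed_capabilities(age):
        # Walk the tiers in order and keep the last one the age qualifies for.
        allowed = set()
        for min_age, caps in AGE_TIERS:
            if age >= min_age:
                allowed = caps
        return allowed

    print(allowed_capabilities(14))  # calls, sms, supervised_browsing
    ```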

    I suspect laws will be applied like those for alcohol within the home, to allow for family-owned game systems and such.

    But lapses that lead to actual demonstrated harm, such as mental illness leading to bodily harm, or violence due to radicalization, need to be treated as if a parent just fed their child alcohol without care, or at least enabled them to get it, if it’s evident that they didn’t even try.

    Straight up, it’s also a cultural shift: 13-16 year olds gaming at home under parental guidance, but not being bought significant personal compute, since it would not be sold to them or for the purpose of giving it to them.

    Usage in school is all fine and good, but seeing babies with iPads at the mall should be seen as badly as letting them smoke (with the secondhand smoke being all the brainrot that leads to brainrotted adults).

    • freethemedia@lemmy.dbzer0.com · 28 days ago

      I really am curious whether anyone could demonstrate a link between the amount of compute and network bandwidth a child has access to as they age and the incidence of anxiety, social, or mood disorders.

      One of the things I feel really thankful for is that the compute and network access I had essentially grew up with me, allowing me to see the harms of full-scale manipulative social algorithms and avoid them.

      I feel like my mental health has been greatly benefitted by staying away from such platforms.

      • freethemedia@lemmy.dbzer0.com · 28 days ago

        This isn’t even just a social media thing. There are so many worse things a kid could get their eyes and ears on with the compute we just hand them willy-nilly.

    • otp@sh.itjust.works · 28 days ago

      > In the future we are going to look back on seeing children use iPads that directly connect them to the most sophisticated engagement and manipulation algorithms ever as something as horrid as a child smoking a cigarette, or doing any other drug

      Are we looking at video games this way now?

      • Stovetop@lemmy.world · 28 days ago

        Depends on the game. Some of them, absolutely. Roblox is one that comes to mind, probably Fortnite as well. And don’t even get me started on mobile games.

  • stinky@redlemmy.com · 28 days ago

    Me: there should be an agency like the FDA that brands news and other media with veracity labels according to guidelines we as voters agree on to prevent fake news and misinformation

    Them: YOU CAN’T BECAUSE OF FREE SPEECH DIE HEATHEN DIE

    Me: ok what about banning kids from social media?

    Them: that’s fine :)

    Hypocrites.