
  • “No frills” might be a bit gentle.

    Judging by other companies with similar outcomes, these are likely products made to meet the minimum legal definition of “vehicle,” and usually nonfunctional or minimally functional. The companies that built the “vehicles” often sell them to themselves (or to rideshare subsidiaries), cash in the Chinese tax credit, and immediately discard them. For an example of this in action, see the SEC filings and investigative articles around Kandi’s fake sales figures. Also see Out of Spec’s Kandi K27 review for what I mean when I say “nonfunctional.”

    The silver lining is that since the discarded EVs are basically made of tin foil with tiny batteries, it’s not as bad a waste of natural resources as you might expect.


  • Sodium-ion chemistry, material sourcing, and manufacturing techniques are all still in flux, and longevity remains an open issue. They’re still a breakthrough innovation, not a solved problem.

    As it turns out, capitalism is better at driving iteration than innovation. Research into groundbreaking tech is expensive, risky, and the benefits tend to be spread out over entire industries, so private investors find it difficult to capitalize on (read: privatize) the benefits.

    There is still investment in optimizing NMC and LFP batteries not because “big lithium” has its hooks in people, but because low-risk patentable iterative improvement is all the private sector is really good for.

    This is why, if you dig deep enough, almost every “world-changing” technology you use today has its roots in government research or grants – microchips (US Air Force and NASA), accelerometers (Sandia Natl Labs, NASA), GPS (US DOD), touchscreens (Oak Ridge Natl Labs), the internet (ARPA), and even the lithium battery itself (NASA). The list goes on, and it gets particularly impressive when you look at medical breakthroughs.

    Today, the US DOE has its net spread wide, funding dozens of different battery chemistries. Argonne Natl Lab is working on Na-ion right now, among others. For mostly political reasons, US-funded research doesn’t “pick winners,” so they won’t ever truly go all-in on one tech.

    TL;DR: Na-ion batteries are still a breakthrough technology, so expect funding/research from state actors like the US DOE or state-backed giants like CATL to push the tech over the line before the private-sector investment floodgates open.


  • I’m not 100% convinced by some of the terrestrial applications for H2, at least on the economics side.

    In my opinion, the aviation industry can handle the cost increases inherent to greener fuel. People fly because it’s fast, not because it’s cheap. As long as the planes are still fast, there’s still a market.

    By contrast, people ride the bus because it’s cheap. According to Tokyo, H2 buses cost 2.6x as much to operate as diesel. According to Montpellier, H2 buses cost 6.3x as much to run as battery-electric buses (and that’s including amortization). So while the tech seems like a great fit, the commercial case is weak; there’s a rough fleet-cost sketch at the end of this comment.

    Shipping with semis is a toss-up. H2 trucks can haul more cargo over a longer distance than battery-electric ones, and I think some people will pay the premium for next-day shipping. But personally… I’d pick the cheap-but-slow shipping 90% of the time.
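
    To make those multipliers concrete, here’s a back-of-the-envelope sketch. Only the 2.6x and 6.3x ratios come from the figures above; the baseline cost, fleet size, and annual mileage are invented placeholders, and since the two ratios come from different cities, chaining them is illustrative only.

    ```python
    # Toy fleet-cost comparison. Only H2_VS_DIESEL (Tokyo) and H2_VS_BEV
    # (Montpellier, incl. amortization) come from the figures above;
    # every other number is a hypothetical placeholder.

    DIESEL_PER_KM = 1.00      # assumed baseline cost per km (arbitrary units)
    H2_VS_DIESEL = 2.6        # Tokyo: H2 bus operating cost vs diesel
    H2_VS_BEV = 6.3           # Montpellier: H2 bus cost vs battery-electric

    H2_PER_KM = DIESEL_PER_KM * H2_VS_DIESEL   # 2.60
    BEV_PER_KM = H2_PER_KM / H2_VS_BEV         # ~0.41 (cross-city, rough)

    FLEET = 50                # hypothetical number of buses
    KM_PER_YEAR = 60_000      # hypothetical annual distance per bus

    for name, per_km in [("diesel", DIESEL_PER_KM),
                         ("battery-electric", BEV_PER_KM),
                         ("hydrogen", H2_PER_KM)]:
        print(f"{name:>16}: {per_km * FLEET * KM_PER_YEAR:,.0f} units/year")
    ```

    Even with made-up baseline numbers, the ratios alone mean a hydrogen fleet runs at several times the annual cost of a battery-electric one, which is the whole commercial-case problem in one line.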


  • Hydrogen works pretty well for aviation, though there are three main challenges still being worked on: size, materials, and fuel source.

    Hydrogen is nice and lightweight (per kilogram it carries roughly three times the energy of jet fuel), but even liquefied it stores far less energy per liter, so the tanks and plumbing take up a lot of space. That cuts into cargo volume, basically limiting the range if you want to take passengers with you.

    The second issue is that fuel cells currently require quite a lot of platinum, and PEM electrolysis also requires a lot of platinum-group metals and rare ones like iridium. Materials scientists are working on this, and I figure if they can take the cobalt out of batteries, they can take the platinum out of fuel cells.

    The question that comes up the most when talking about hydrogen is where the hydrogen itself comes from. Right now, it’s mostly made by steam methane reforming (SMR) or similar fossil-fuel processing, which is nearly as bad for the environment as burning the fossil fuel directly. But there are promising advances in renewable electrolysis (such as taking advantage of peak solar for “free” electricity) which are closing the gap between SMR and renewable H2. It’ll never be as cheap as jet fuel, but it’s at least economically feasible; there’s a rough cost sketch below.
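
    For rough orders of magnitude on that “peak solar” point: a common figure for PEM electrolyzers is around 50–55 kWh of electricity per kg of H2 (hydrogen’s lower heating value is about 33.3 kWh/kg). The electricity prices below are hypothetical scenarios, and capex, compression, and storage are all ignored.

    ```python
    # Electricity cost alone to electrolyze 1 kg of H2, under assumed
    # prices. KWH_PER_KG is a typical PEM figure; the price scenarios
    # are hypothetical, not quotes from any real project.

    KWH_PER_KG = 52.0   # assumed electrolyzer energy use, kWh per kg H2

    def cost_per_kg(price_per_kwh: float) -> float:
        """Electricity cost (USD) to produce 1 kg of H2."""
        return price_per_kwh * KWH_PER_KG

    for label, price in [("grid average", 0.10),
                         ("cheap wind PPA", 0.04),
                         ("midday solar surplus", 0.02)]:
        print(f"{label:>22}: ${cost_per_kg(price):.2f}/kg H2")
    ```

    At $0.02/kWh that works out to about $1/kg in electricity alone, in the same ballpark as the roughly $1–2/kg commonly cited for SMR hydrogen. That’s the gap-closing argument in a nutshell.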



  • “It’s absolutely true that the training process requires downloading and storing images”

    This is the process I was referring to when I said it makes copies. We’re on the same page there.

    I don’t know what the solution to the problem is, and I doubt I’m the right person to propose one. I don’t think copyright law applies here, but I’m certainly not arguing that copyright should be expanded to include the statistical matrices used in LLMs and DPMs. I suppose plagiarism law might apply for copying a specific style, but that’s not the argument I’m trying to make, either.

    The argument I’m trying to make is that while it might be true that artificial minds should have the same rights as human minds, the LLMs and DPMs of today absolutely aren’t artificial minds. Allowing them to run amok as if they were isn’t just unfair to living artists… it could do irreparable damage to our culture, because they can’t take up the mantle of the artists they edge out or pass their knowledge down to the next generation.


  • It doesn’t change anything you said about copyright law, but current-gen AI is absolutely not “a virtual brain” that creates “art in the same rough and inexact way that we humans do it.” What you are describing is called Artificial General Intelligence, and it simply does not exist yet.

    Today’s large language models (like ChatGPT) and diffusion models (like Stable Diffusion) are statistics machines. They copy down a huge amount of example material, process it, and use it to calculate the most statistically probable next word (or pixel), with a little noise thrown in so they don’t make the same thing twice (there’s a toy sketch of this at the end of this comment). This is why ChatGPT is so bad at math and Stable Diffusion is so bad at counting fingers – they aren’t making any rational decisions about what they spit out. They’re not striving for the correct answer; they’re just producing the most statistically average output given the input.

    Current-gen AI isn’t just viewing art; it’s storing a digital copy of it on a hard drive. It doesn’t create, it interpolates. In order to imitate a person’s style, it must make a copy of that person’s work; describing the style in words is insufficient. If human artists (and by extension, art teachers) lose their jobs, AI training sets stagnate, and everything the models produce becomes repetitive and derivative.

    None of this matters to copyright law, but it matters to how we as a society respond. We do not want art itself to become a lost art.
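
    The toy sketch promised above: this is what “the most statistically probable next word, with a little noise thrown in” looks like mechanically. The vocabulary and logits are invented for illustration; a real model does the same kind of draw over tens of thousands of tokens.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    vocab  = ["cat", "dog", "pizza", "the", "runs"]   # made-up vocabulary
    logits = np.array([2.0, 1.8, 0.1, 1.0, 0.5])      # hypothetical model scores

    def sample_next(logits, temperature=1.0):
        """Softmax over the scores, then draw one token at random."""
        z = logits / temperature
        probs = np.exp(z - z.max())
        probs /= probs.sum()
        return rng.choice(vocab, p=probs)

    print([sample_next(logits, temperature=0.8) for _ in range(5)])
    # Temperature is the "little noise": near 0 it always picks the top
    # token, higher values get more random. Nothing here checks whether
    # the output is *true*, only whether it is statistically likely.
    ```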