pengbuzz Posted yesterday at 08:17 AM

On 6/3/2025 at 10:38 PM, Big s said:
Either that or someone will do some unethical thing by cloning a brain and hooking it up to an SUV

8 hours ago, Seto Kaiba said:
That would present its own problems... not just in terms of how the system translates the inputs and outputs from the living neural tissue into usable machine commands, but in terms of what a massive pain in the arse it is to keep living tissue alive and healthy outside of a body. It'd be a real mess if your car stopped working because its computer got hypothermia, or heat stroke, or just plain died due to lack of maintenance. One of the virtues of inorganic computing is that the hardware can continue to operate for extended periods in environments that would kill a human. (Regular consumer cars are tested in hot and cold cells for extended periods at temperatures that would be fatal if you were in there with it for the duration of the test. I actually used to spend a lot of time in one of those labs... got powerfully ill repeatedly going from the -40 degrees C cold cell to the 50 degrees C hot cell.)

Not to mention the glands it would need, such as the thyroid and pancreas, as well as the liver and kidneys. You'd have to maintain oxygenation, nutrition/hydration, dialysis, and the complex mix of hormones, enzymes, and so on. The immune system would also be utterly wrecked, and that brain would need continual infusions of red blood cells (the brain has no capability to make its own, nor white cells or antibodies).

That leads us to health: what if the brain develops some kind of disease such as cancer or Parkinson's? Or the vehicle gets into a crash and it ends up with a TBI? It's easy to say "we'll just replace it after scanning it," but you have to remember that some injuries and illnesses don't show up right away. And that can lead to instances where the vehicle ends up making catastrophic choices.

Speaking of that: without visual/tactile/audio input to learn language, rational thinking, decision-making skills, judgment, and morality ("it's WRONG to run over someone in the crosswalk because they took a few seconds too long!"), you'd have a virtual newborn trying to drive a 2 1/2 ton vehicle around!

On that note: what if the brain has a "bad day" or a temper tantrum? We've all seen children act up in public (restaurants, retail stores, etc.); would you want that in control of a 5,000 lb vehicle when it decides it "doesn't want to play nice" anymore?

I know there are probably "ways to control" these things that folks will bring up, or that these concerns "aren't so much of a concern"; I don't claim to be an expert on these things. But I know enough to safely say that much of this AND the moral implications (along with Seto's breakdown of the issues with it thus far) are more than enough to render the idea of a "living brain" in a self-driving car something you'd really not want to happen.