We've had an extremely light discussion around AI, with artists complaining after realising they too can be automated, and we had the concept of a literal baby-making facility; I'm assuming genetic rewriting for immortality is next week...
But getting away from the leftist takes on these subjects and the meming about facing a Skynet/Matrix future, what are your real opinions on this kind of tech?
Personally, with AI it's a Pandora's box: if we CAN create sentient artificial life, the best hope is not to do it, but if some dumbass does, we embrace and integrate that being, as starting a conflict will probably be the reason we die.
As for artificial wombs, having the tech is needed, but not in the commercial sense that video presentation gave; more in a 'last resort, literally required to save humanity' sense.
These are just the two mentioned this week alone, but is there any other tech you'd put on the 'forbidden' side? And what are your takes on the ones discussed?
A base pair is two bits (four choices: A, T, C, G), so the ~3.2 billion base pairs of the human genome come to roughly 800 MB (~760 MiB) of data.
Actual genes that code for things are maybe 10% of DNA, and around 1/3 of genes have to do with the brain. The other 90% isn't actual 'junk' but it isn't code. So maybe something like 25 MiB.
Most of that is probably not the algorithm for intelligence but control of the actual physical processes, plus hardcoded things like instincts, bonding, etc. (we have specific hardwired reactions to spiders and snakes, for instance).
So likely some small fraction of 25 MiB. That's pretty small.
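For concreteness, here's that back-of-the-envelope maths as a quick script. The ~3.2 billion base pair genome size, the ~10% functional share, and the ~1/3 brain-related share are the rough assumptions from above, nothing more precise:

```python
# Rough sketch of the estimate above. All figures are ballpark
# assumptions: ~3.2e9 base pairs, ~10% functional, ~1/3 brain-related.
BASE_PAIRS = 3.2e9
BITS_PER_PAIR = 2  # four symbols (A, T, C, G) -> log2(4) = 2 bits

total_mib = BASE_PAIRS * BITS_PER_PAIR / 8 / 2**20
functional_mib = total_mib * 0.10   # the ~10% that's functional
brain_mib = functional_mib / 3      # the ~1/3 related to the brain

print(f"whole genome:  {total_mib:.0f} MiB")       # ~763 MiB
print(f"functional:    {functional_mib:.0f} MiB")  # ~76 MiB
print(f"brain-related: {brain_mib:.0f} MiB")       # ~25 MiB
```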
The self-organization process is the code. The hardware part is irrelevant because an AI will be running on totally different hardware.
General intelligence fundamentally comes from being able to adapt and compete in your environment, and that started at least 500 million years ago (the Cambrian explosion). Every animal has the basic algorithm for general intelligence, just constrained by size and programmed behaviors. It didn't evolve in parallel in humans and octopuses.
So how big was the genome of animals way back then? I'd expect it to be a lot smaller.
I'd guess a general-purpose AI algorithm would easily fit in, say, 10,000 lines of code.
What a silly and arbitrary conversion. The letters are an abstraction; I could just as easily say each base pair is 660 bits, since that's the average number of distinct atoms within them. Now the estimate is out by nearly three orders of magnitude. But no, an instruction-to-instruction comparison is a far more valid rationale than a data-size one: one that isn't muddied by vastly different storage efficiencies between systems.
The idea that only 10% of DNA actually does something is junk pseudoscience, similar to the stupid "humans only use 10% of their brain" myth.
You don't dismiss setting up definitions, return instructions, and anything that isn't an if-then statement as "not code", do you? Then apply that method equally to biological systems.
Hence the < in my original statement. I can maybe understand arguing for not including motor control in biological intelligence, but trying to remove 'instinct' from intelligence is crazy to me.
It's relevant when you keep insisting on measuring things by an imagined storage size. Self-organising systems can have an incredibly tiny storage footprint compared to the complexity of what they produce.
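As a toy illustration of that point (nothing to do with DNA specifically, just an assumed example): an elementary cellular automaton like Wolfram's Rule 30 is specified by a single byte, yet the pattern it unfolds is chaotic and complex. The rule's storage footprint says almost nothing about the complexity of what it organises:

```python
# Toy example of a self-organising system: Rule 30, an elementary
# cellular automaton. The entire "design" is one 8-entry lookup table
# (a single byte), yet the pattern it produces is chaotic and complex.
RULE = 30  # bit i of RULE gives the next state for neighbourhood i

def step(cells):
    # Each cell's next state depends only on itself and its two neighbours.
    padded = [0] + cells + [0]
    return [(RULE >> (padded[i - 1] * 4 + padded[i] * 2 + padded[i + 1])) & 1
            for i in range(1, len(padded) - 1)]

row = [0] * 31 + [1] + [0] * 31  # start with a single live cell
for _ in range(16):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```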
The nucleotides are the basic indivisible unit, and there are four of them in nature (uracil isn't used in DNA). Four choices, two bits of information. Invoking atoms is completely irrational; you're only suggesting it so you can desperately convince yourself that you're not wrong about the size.
Summary of what the science says: functional DNA is 10% to 20%, depending on the definition used. That's not pseudoscience; you just don't want to believe, for whatever reason, that the relevant parts are small.
I'm not. Only ~1% of DNA codes for proteins; the other 9% is the "if-then statements". The remaining 90% is more like UI, asserts, and printf statements in that analogy: it does something, and it's part of the code, but not part of the algorithm.
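To make the analogy concrete, here's a hypothetical toy function where the "algorithm" is a few lines and everything else is scaffolding that does something but isn't the clever part:

```python
# Most lines here are scaffolding (validation, logging, formatting);
# the core algorithm is only the handful of lines in the while loop.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger(__name__)

def binary_search(items, target):
    assert items == sorted(items), "input must be sorted"        # scaffolding
    log.info("searching %d items for %r", len(items), target)    # scaffolding
    lo, hi = 0, len(items)
    while lo < hi:                      # the actual algorithm starts here...
        mid = (lo + hi) // 2
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid                    # ...and ends here
    found = lo < len(items) and items[lo] == target
    log.info("result: %s", found)                                # scaffolding
    return found

print(binary_search([1, 3, 5, 8, 13], 8))  # True
```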
We're probably in agreement here because making an intelligent AI that's not a total psycho is probably the bulk of the "algorithm". Even going from Lore to Data in ST:TNG is going to be way harder than the actual intelligence part.
I'm not measuring "things" by storage size, I'm measuring the algorithm's size, and I've explained my reasoning for why it's necessarily very small in computing terms. Of course an actual human-level intelligent AI will use gigaquads of storage and perform massive computation.
But how much CPU/memory the algorithm uses isn't material to limiting the spread of that information. If the algorithm is the threat to our existence, then it's the algorithm that has to be prevented from being passed around.