Not true. We were not aware of the direct applications/implications of the Casimir effect for microcircuitry until this century, because it required miniaturization small enough for it to manifest in our transistors as a quantum-mechanical phenomenon.
We did not possess the precision, computational ability, and artificial-intelligence maturity to manufacture our modern microcircuitry until relatively recently. The processors we make now are so far beyond what we had before the 1970s that they would seem like alien technology to the computing engineers of yesterday. Even if we sent a middle-of-the-road CPU to an electrical engineering team from 1965, it would be impossible for them to replicate that technology, because they possessed neither the software maturity nor the robotics necessary to manufacture a chip of the same performance and efficiency.
Right now, our limits are the quantum-mechanical limits of transistor technology. We are coming up with inventive and ingenious ways around these real limits in physics, of course. Some of those are 3D substrates instead of the traditional single-plane transistor layout. Others are smarter algorithms for moving data around the CPU. Memristors were seen as one potential replacement for transistors, but that fell out of favor because, for example, their fetch latency is not as good as SRAM's.
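To make the data-movement point a bit more concrete, here is a minimal sketch in C of cache blocking (my own illustration, not something from this thread, and a software-level analogue of the hardware tricks above rather than the CPU-internal logic itself). The tiled loop touches memory in cache-sized chunks, so less time is spent shuttling data between the CPU and main memory:

    #include <stddef.h>

    #define N 512
    #define TILE 64   /* assumed tile size; tune to the target CPU's cache */

    /* Naive triple loop: strides through B column-wise and thrashes the cache. */
    void matmul_naive(double A[N][N], double B[N][N], double C[N][N]) {
        for (size_t i = 0; i < N; i++)
            for (size_t j = 0; j < N; j++)
                for (size_t k = 0; k < N; k++)
                    C[i][j] += A[i][k] * B[k][j];
    }

    /* Tiled version: works on TILE x TILE blocks so the operands stay resident
       in cache, cutting traffic between the CPU and main memory. */
    void matmul_tiled(double A[N][N], double B[N][N], double C[N][N]) {
        for (size_t ii = 0; ii < N; ii += TILE)
            for (size_t kk = 0; kk < N; kk += TILE)
                for (size_t jj = 0; jj < N; jj += TILE)
                    for (size_t i = ii; i < ii + TILE; i++)
                        for (size_t k = kk; k < kk + TILE; k++)
                            for (size_t j = jj; j < jj + TILE; j++)
                                C[i][j] += A[i][k] * B[k][j];
    }

Both functions compute the same thing; only the order in which data moves changes.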
You said the same thing I said, just using different words. Nothing new, just more of the same. The memristor was named in 1971. All of the math for everything we do today was invented in the early 1900s. Literally none of the technology we use today would be even remotely close to alien technology to computer scientists of the past; they literally built all this shit at a slightly lower resolution. Instead they would go, oh neat, look what my invention grew into. I am sure you have spent some time reading up on the supercomputers, mainframes, minicomputers, etc. of the 50s, 60s, and 70s ... mind-boggling on two fronts:
1) The 50s were 10 years from WW2, a war won with piston-engine airplanes. The 60s were 20 years from WW2 and on par with the SR-71 and the moon landing.
2) Everything the supercomputers, mainframes, minicomputers, etc. of those eras did, we do right now with a slightly different remix of simply more of the same off-the-shelf architectures. People think Unix is something special because it runs everything in the world; what they don't realize is that it was basically an accident, because two (genius-level) programmers wanted to play some games in an environment separate from the hardware, and the bloody OS just kinda jumped out of the lab and into the world. Then you dig just a wee bit and learn that Thompson and Ritchie worked on / built Multics (1969), and you check that out and you go, holy fuck, there isn't even anything new in the operating-system space. Even spaces of pure intellectual exploration are just a process of discovering universal mathematical primitives and building just more-is-more stuff on top of that.
When you say something from the future is impossible for something in the past, you are just pointing out that more of the same at a much higher level of resolution would be much more difficult at a lower level of resolution. Technically, everything we have today could be built with processor tech of the past; you would just need a lot more of it in terms of volume it would occupy in 3D space, power it would consume and heat it would generate.
As Yngwie Malmsteen always said, more is more.
The only new thing in tech is Wi-Fi, really.
You said the same thing I said just using different words.
I didn't read the comments. My bad if you felt I plagiarized you - that was not my intention even a little bit. But it must have seemed pretty shitty to you. I promise, not my intention at all.
When you say something from the future is impossible for something in the past, you are just pointing out that more of the same at a much higher level of resolution would be much more difficult at a lower level of resolution.
No, I'm saying it is literally impossible. Modern chips, the progress in AI, the progress and precision in robotics, and even the progress in clean-room environments make it impossible to replicate even a run-of-the-mill microprocessor from today, because it requires all of those things to be in place to make these chips. In other words, they would need to replicate the chip to make a chip like it, but they need that tech to make a chip like it. They would be stuck in an endless chicken-and-egg loop until that whole list caught up, and then it would be useless anyway, because they would already have the chip. They would also need to innovate massively in the OS and computing-architecture space because of the MASSIVE amount of changes we have made to how operating systems work. In other words, the computing power of a modern microchip would be wasted in their hands even if they could replicate it, because they would not know how its micro-architecture is utilized to make use of the processing power.
Here's an example where a small incremental change would have been possible to replicate (though it would require someone to explain the intellectual property to them, because it is more a way of arranging the micro-architecture, along with the algorithms that govern how instruction sets are handled in the OS).
A great example of a huge incremental step that would have been within striking distance of early-1970s computing technology is the introduction of RISC. RISC designs greatly improve computing efficiency as well as overall compute power, and engineers of that era could have used the technology they already had to implement those innovations.
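As a rough illustration (mine, not from the original post, and the mnemonics are generic rather than any particular real ISA), here is how one C statement maps onto a CISC-style instruction versus a RISC-style load/store sequence:

    /* One C statement, two instruction-set philosophies. The assembly in the
       comment is generic and illustrative, not output from any real compiler. */
    void add_in_place(int *a, const int *b) {
        /* CISC-style: one complex memory-to-memory instruction, e.g.
               ADD [a], [b]
           RISC-style: only simple, fixed-format register operations, e.g.
               LOAD  r1, 0(a)
               LOAD  r2, 0(b)
               ADD   r1, r1, r2
               STORE r1, 0(a) */
        *a = *a + *b;
    }

The trade-off is more instructions per operation in exchange for hardware simple enough to decode and pipeline within the transistor budgets of that era.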
power it would consume and heat it would generate.
I forgot about this. That's a very big deal, too. We have made lots of progress in materials science as well. That, however, they could steal without issue, because it is just a matter of analyzing the materials and the ways they are arranged and used. Obviously, some of the assembly would be out of reach of their tech (see above), but some of it could be readily implemented on the quick-quick.
I was thinking about doing a 15-20 minute documentary on the history of computers in the post-WWII era. You mentioned the huge amount of progress that occurred shortly after then. We made more computing innovations from 1950 to 1970 than we have from 2000 to 2020. It was that big of a deal back then.