
With the exception of wifi, all computer tech we use today existed before the 80s. All that has happened since is we do more of it.

> You said the same thing I said just using different words.

I didn't read the comments. My bad if you felt I plagiarized you; that was not my intention even a little bit. I get that it must have seemed pretty shitty to you, but I promise it wasn't intentional at all.

> When you say something from the future is impossible for something in the past, you are just pointing out that more of the same at a much higher level of resolution would be much more difficult at a lower level of resolution.

No, I'm saying it is literally impossible. Modern chips, the progress in AI, the progress and precision in robotics, and even the progress in cleanroom environments all make it impossible to replicate even a run-of-the-mill microprocessor from today, because all of those things have to be in place to make these chips. In other words, they would need a chip like it (and the tooling built on it) just to make a chip like it. They would be stuck in an endless chicken-and-egg loop until that whole list of technologies caught up, and by then it would be useless anyway, because they would already have the chip. They would also need to vastly innovate in the OS and computing-architecture space because of the MASSIVE amount of changes we have made to how operating systems work. In other words, the computing power of a modern microchip would be wasted in their hands even if they could replicate it, because they would not know how the microarchitecture is utilized to make use of the processing power.

Here's an example where a small incremental change would have been possible to replicate (though it would require someone to explain the intellectual property to them, because it is more a way of arranging the microarchitecture, along with the algorithms that govern how instruction sets are handled in the OS).

A great example of an incremental step that would have been within striking distance of early-1970s computing technology is the introduction of RISC. RISC architectures greatly improve computing efficiency as well as overall compute power, and they could have been implemented with the technology of that era.
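
To make that concrete, here is a toy sketch in C of the RISC load/store idea (a made-up mini-ISA for illustration, not any real chip): every instruction has one small, fixed format, only loads and stores touch memory, and everything else works register-to-register. That regularity is what makes the hardware simple to decode and pipeline.

```c
#include <stdio.h>
#include <stdint.h>

/* A reduced instruction set: four simple operations, one fixed format. */
enum op { LOAD, STORE, ADD, HALT };
struct instr { enum op op; int dst, src1, src2; };

int main(void) {
    int32_t reg[4] = {0};                       /* small register file */
    int32_t mem[8] = {5, 7, 0, 0, 0, 0, 0, 0};  /* tiny data memory    */

    /* Compute mem[2] = mem[0] + mem[1]. A CISC-style machine might do this
       in one memory-to-memory instruction; a load/store machine spells out
       each simple step instead. */
    struct instr prog[] = {
        { LOAD,  0, 0, 0 },   /* r0 <- mem[0]  */
        { LOAD,  1, 1, 0 },   /* r1 <- mem[1]  */
        { ADD,   2, 0, 1 },   /* r2 <- r0 + r1 */
        { STORE, 2, 2, 0 },   /* mem[2] <- r2  */
        { HALT,  0, 0, 0 }
    };

    for (int pc = 0; ; pc++) {                  /* trivial fetch/execute loop */
        struct instr i = prog[pc];
        switch (i.op) {
        case LOAD:  reg[i.dst] = mem[i.src1];               break;
        case STORE: mem[i.dst] = reg[i.src1];               break;
        case ADD:   reg[i.dst] = reg[i.src1] + reg[i.src2]; break;
        case HALT:  printf("mem[2] = %d\n", (int)mem[2]);   return 0;
        }
    }
}
```

The point isn't the C itself; it's that each instruction does so little that the control logic stays simple, which is exactly why this kind of design was within reach of the transistor budgets of that era.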

> power it would consume and heat it would generate.

I forgot about this. That's a huge deal, too. We have made lots of progress in materials science as well. That, however, they could steal without issue, because it is just a matter of analyzing the materials and the ways they are arranged and used. Obviously, some of the assembly would be out of reach of their tech (see above), but some of it could be readily implemented on the quick-quick.

I was thinking about doing a 15-20 minute documentary on the history of computers from the post-WWII era. You mentioned the huge amount of progress that occurred shortly after then: we made more computing innovations from 1950 to 1970 than we have from 2000 to 2020. It was that big of a deal back then.