With the exception of wifi, all computer tech we use today existed before the 80s. All that has happened since is we do more of it.

[–] 1 pt

Most airplane tech is 100 years old; we have just refined it.

Rockets were invented by the Chinese 1000 years ago... they still work the same way today, just bigger.

[–] 0 pt

Not true. We were not aware of the direct applications/implications of the Casimir effect for microcircuitry until this century, because it required miniaturization small enough for it to manifest in our transistors as a quantum-mechanical phenomenon.

Until relatively recently we did not possess the precision, computational ability, and artificial intelligence maturity needed to manufacture our modern microcircuitry. The processors we make now are so far beyond what we had before the 1970s that they would seem like alien technology to the computing engineers of yesterday. Even if we sent a middle-of-the-road CPU to an electrical engineering team from 1965, it would be impossible for them to replicate that technology, because they possess neither the software maturity nor the robotics necessary to manufacture a chip of the same performance and efficiency.

Right now, our limits with transistor technology are set by quantum mechanics. We are coming up with inventive and ingenious ways around these real limits in physics, of course: 3D substrates instead of the traditional single-plane transistor layout, and smarter algorithms for moving data around the CPU. Memristors were seen as one potential replacement for transistors, but they fell out of favor because, for example, their fetch latency is not as good as SRAM's.

[–] 0 pt (edited )

You said the same thing I said, just using different words. Nothing new, just more of the same. The memristor was named in 1971. All of the math for everything we do today was invented in the early 1900s. Literally none of the technology we use today would be even remotely close to alien technology to the computer scientists of the past; they literally built all this shit at a slightly lower resolution. Instead they would go, oh neat, look what my invention grew into. I am sure you have spent some time reading up on the supercomputers, mainframes, minicomputers, etc. of the 50s, 60s, and 70s... mind-boggling on two fronts:

1) The 50s were 10 years removed from WW2, a war won with piston-engine airplanes. The 60s were 20 years removed from WW2 and gave us the SR-71 and the moon landing.

2) Everything the supercomputers, mainframes, minicomputers, etc. of those eras did, we do right now with a slightly different remix of more of the same off-the-shelf architectures. People think Unix is something special because it runs everything in the world; what they don't realize is that it was basically an accident, because two (genius-level) programmers wanted to play some games in an environment separate from the hardware, and the bloody OS just kinda jumped out of the lab and into the world. Then you dig just a wee bit and learn that Thompson and Ritchie worked on and built Multics (1969), and you check that out and you go, holy fuck, there isn't even anything new in the operating system space. Even areas of pure intellectual exploration are just a process of discovering universal mathematical primitives and building more-is-more stuff on top of them.

When you say something from the future is impossible for someone in the past, you are just pointing out that more of the same at a much higher level of resolution would be much more difficult at a lower level of resolution. Technically, everything we have today could be built with processor tech of the past; you would just need a lot more of it in terms of the volume it would occupy in 3D space, the power it would consume, and the heat it would generate.
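To put rough numbers on "a lot more of it," here is a back-of-envelope sketch in C. The transistor counts are order-of-magnitude assumptions (about 2,300 for a 1971 chip like the Intel 4004, on the order of ten billion for a modern desktop CPU), so treat the result as a scale estimate, not a measurement.

```c
#include <stdio.h>

int main(void) {
    /* Order-of-magnitude assumptions, not precise figures. */
    const double transistors_1971   = 2.3e3;  /* roughly an Intel 4004 */
    const double transistors_modern = 1.0e10; /* roughly a modern desktop CPU */

    /* How many 1971-vintage chips would it take just to match the transistor
     * budget of one modern CPU? (Ignores clock speed, interconnect, power,
     * and cooling, all of which make the gap even larger.) */
    printf("~%.1e chips of 1971 vintage per modern CPU\n",
           transistors_modern / transistors_1971);
    return 0;
}
```

That ratio, millions of chips before you even count clock speed, is what "more of it" cashes out to in volume, power, and heat.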

As Yngwie Malmsteen always said, more is more.

The only new thing in tech is wifi really.

[–] 1 pt

You said the same thing I said just using different words.

I didn't read the comments. My bad if you felt I plagiarized you - that was not my intention even a little bit. But it must have seemed pretty shitty to you. I promise, not my intention at all.

When you say something from the future is impossible for something in the past, you are just pointing out that more of the same at a much higher level of resolution would be much more difficult at a lower level of resolution.

No, I'm saying it is literally impossible, because modern chips, the progress in AI, the progress and precision in robotics, and even the progress in clean rooms make it impossible to replicate even run-of-the-mill microprocessors from today. It requires all of those things to be in place to make these chips. In other words, they would need the chip in order to make a chip like it; they would be stuck in an endless chicken-and-egg loop until that whole stack caught up, and by then it would be useless anyway, because they would already have the chip. They would also need to innovate vastly in the OS and computing architecture space because of the MASSIVE changes we have made to how operating systems work. In other words, the computing power of a modern microchip would be wasted in their hands even if they could replicate it, because they would not know how the micro-architecture is utilized to make use of the processing power.

Here's an example where a small incremental change would have been possible to replicate (though it would require someone to explain the intellectual property to them, because it is more a way of arranging the micro-architecture, plus the algorithms that govern how instruction sets are handled by the OS).

A great example of a huge incremental step that would have been within striking distance of early-1970s computing technology is the introduction of RISC. RISC designs greatly improve computing efficiency as well as overall compute power, and engineers of that era could have implemented them with the technology they already had.
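To illustrate what RISC buys you, here is a minimal sketch in C. The assembly in the comments is illustrative pseudo-assembly, not output from any particular compiler; the point is that a load/store machine only touches memory through explicit loads and stores and does arithmetic only between registers, which is what makes the hardware simple to pipeline.

```c
#include <stdio.h>

/* One line of C arithmetic... */
static int add_scaled(int a, int b) {
    return a + 4 * b;
}

/*
 * ...lowers, on a load/store (RISC-style) machine, to a handful of simple,
 * fixed-format register-to-register instructions (illustrative pseudo-assembly
 * only):
 *
 *   slli r2, r2, 2    ; r2 = b << 2  (i.e. b * 4)
 *   add  r1, r1, r2   ; r1 = a + r2
 *   ret               ; return value in r1
 *
 * A CISC-style machine might instead offer one complex instruction that reads
 * an operand from memory and scales it in a single step; RISC trades that away
 * for uniform, easy-to-pipeline instructions.
 */

int main(void) {
    printf("%d\n", add_scaled(3, 5)); /* prints 23 */
    return 0;
}
```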

power it would consume and heat it would generate.

I forgot about this. That's a huge deal, too. We have made lots of progress in materials science as well. That, however, they could steal without issue, because it is just a matter of analyzing the materials and the ways they are arranged and used. Obviously, some of the assembly would be out of reach of their tech (see above), but some of it could be implemented readily, on the quick-quick.

I was thinking about doing a 15-20 minute documentary on the history of computers from the Post-WWII era. You mentioned the huge amount of progress that occurred shortly after then. We made more computing innovations from 1950 to 1970 than we have from 2000 to 2020. It was that big of a deal, back then.

[–] 0 pt

The government is roughly 15 years ahead of everyone else to maintain their control. If you had the technology they do, you could spy on anyone you wanted or even sabotage them. (((They))) don't want that.

[–] [deleted] 4 pts

The government is roughly 15 years ahead of everyone else to maintain their control.

That's the Hollywood propaganda universe. In the real world, the gov't is a clusterfuck at all levels, and it's more like 15 years behind everyone else.

[–] 0 pt

Sometimes I wonder. They have tech the average Joe could never acquire, but yeah, I agree 100% it's a complete clusterfuck. Biden is just a puppet and Kamaltoe is a diversity hire. I bet he isn't even aware of military orders and command decisions. He's gone at this point, just a dementia-ridden, diaper-wearing hospital patient, and Kamaltoe and Skeletorosi have targets on their backs from the Klinton Krime Kartel because Killary desperately wants to be the first official female president and break the "glass ceiling," as she called it.

[–] [deleted] 3 pts

Sometimes I wonder.

Just go work for the gov't and see for yourself, though I don't recommend it. Advanced tech never comes from the gov't, but the gov't is pretty good at screwing up technical advances.

[–] 1 pt

Not true. The government takes a decade just to design and deploy a satellite. Look at how long it took to build the James Webb telescope. By the time it's deployed, most of the tech is ancient.

[–] 0 pt

Simply incorrect; while the general names for things may have existed since before the 80s, the methods used to create them and the way they perform are drastically different and are constantly being replaced or refined. Take transistors, for instance.

Similarly, the algorithms computers use are vastly different and are constantly being improved.

[–] 0 pt

That room of electronics now fits in your phone.

[–] 0 pt

The physics doesn't change; it's the implementation that has been refined. Transistor tech has improved 10,000-fold... I mean, we literally use lasers to etch computer chips now. That shit was not happening in 1980.

[–] 0 pt

And yet, what the audio engineer in the video is doing is almost EXACTLY the same as what an audio engineer is doing today.

People keep making the point that refinement has been done. No question, refinement requires tremendous amounts of investment, R&D, and creativity. Yet our computers still do exactly the same things they have done since about the 50s; they just do more of it. DARPA's initial demonstration of the tech behind the internet was in 1969. Lisp and APL were invented in the late 50s and contain virtually all the programming concepts that ended up in every language since. And no matter how "refined" and "capable" modern algorithms may be, once compiled to machine code all that fancy math gets executed by the same logic gates that have been in CPUs since the 50s. People are still doing word processing and spreadsheets as they always have.

The only tech that is really new relative to the 50s is wireless. Everything else is a refinement of what those people invented.

[–] -1 pt

You're wrong on several points. The logic is certainly not the same as in the 50s. If it were, I would be able to take a simple program compiled for a modern processor and execute it on a 1950s computer. That's simply impossible.

[–] 1 pt (edited )

This is just an issue of compiler support for said architectures. Add a backend for a CPU architecture, compile your hello world, and you are good to go. GCC already supports dozens of architectures. If anyone had Elon-sized pocket change, it might be fun to see how far that can be extended, particularly through the most important and interesting CPU architectures in history.
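As a concrete illustration, the same source builds unchanged for wildly different targets once a backend exists; the cross-compiler named in the comment is one example toolchain, so treat the exact command as an assumption about your setup.

```c
/* hello.c -- the same source builds for any architecture GCC has a backend for.
 *
 * Native build:        gcc hello.c -o hello
 * Cross build (e.g.):  riscv64-linux-gnu-gcc hello.c -o hello_riscv
 *                      (assumes a riscv64 cross-toolchain is installed)
 */
#include <stdio.h>

int main(void) {
    printf("hello, world\n");
    return 0;
}
```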

All CPUs use the same handful of logic gates to do computation; nothing has changed there.
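To make that concrete, here is a minimal sketch in C of a 1-bit full adder built purely from AND, OR, and XOR, the same Boolean primitives a 1950s machine used; chaining 32 or 64 of these is, conceptually, all the integer adder in a modern CPU is doing.

```c
#include <stdio.h>

/* One-bit full adder built only from AND (&), OR (|), and XOR (^) --
 * the same Boolean gates available in the 1950s. */
static void full_adder(int a, int b, int carry_in, int *sum, int *carry_out) {
    *sum = a ^ b ^ carry_in;
    *carry_out = (a & b) | (carry_in & (a ^ b));
}

/* Chain 8 full adders into a ripple-carry adder; a real CPU just does this
 * wider (and with faster carry logic), not differently. */
static unsigned add8(unsigned a, unsigned b) {
    unsigned result = 0;
    int carry = 0;
    for (int i = 0; i < 8; i++) {
        int sum;
        full_adder((int)((a >> i) & 1), (int)((b >> i) & 1), carry, &sum, &carry);
        result |= (unsigned)sum << i;
    }
    return result & 0xFF; /* wrap at 8 bits, like the hardware would */
}

int main(void) {
    printf("%u\n", add8(200, 57)); /* 257 wraps to 1 in 8 bits */
    return 0;
}
```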

What about the memory management unit?