This is a bit of a tech question.

I've been doing the Linux thing for the last 10 years. When I started there were about three different ways to install anything, all of which worked every time.

You could run `{package-manager} {install cmd or flags} {package}` and install almost anything. You could download it from the project's website. Or you could get the source and run `./configure; make; make install`. And it would always work.

Using make was considered the hard way, but it was just three commands, universal to every project. Even non-C projects would mirror it as a convention. It worked 100% of the time no matter what device you were on.
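
For reference, that old workflow really was just a handful of commands. A sketch below, with apt and a made-up `some-package` standing in purely as examples:

```
# 1. The distro package manager (apt here, purely as an example)
sudo apt install some-package

# 2. Or grab a tarball from the project's site and build it the classic way --
#    the same three commands for every autotools-style project
tar xf some-package-1.0.tar.gz
cd some-package-1.0
./configure
make
sudo make install
```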

Now you want to install TensorFlow or something else.

You type `pip install tensorflow`. It installs. You launch python and run `import tensorflow as tf`. "Illegal instruction."
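
For what it's worth, that "Illegal instruction" crash is usually the prebuilt TensorFlow wheel using CPU instructions (typically AVX) that the machine doesn't have. A rough check on Linux, sketched below; the version pin in the comment is just the commonly cited workaround, not gospel:

```
# Does this CPU advertise AVX? The stock TensorFlow wheels assume it does.
if grep -q avx /proc/cpuinfo; then
    echo "AVX present -- the stock wheel should at least import"
else
    echo "No AVX -- expect 'Illegal instruction' from the stock wheel"
    # Usual workarounds: build from source, or pin an older wheel, e.g.
    # pip install 'tensorflow==1.5'   # reportedly the last release built without AVX
fi
```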

Ok, so you try the next option. Docker. Two commands listed. Now keep in mind, I've seen Docker completely fail to install on a device. You might say it was me, but I saw a 100% install rate for all late-2000s software, without fail. I've also seen other people mention running Docker inside VirtualBox on their Mac. So this is clearly a thing.

[Downfall: Hitler uses docker](https://tube.poal.co/watch?v=PivpCKEiQOQ)

So I do the pull and it goes well, which is surprising. I then run `docker run -it -p 8888:8888 tensorflow/tensorflow` to see the Jupyter notebook. I go to my browser at http://localhost:8888. Nothing.
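
Assuming the current tags on Docker Hub, part of the catch is that the plain `tensorflow/tensorflow` image doesn't start a notebook server at all; the Jupyter variant lives under a separate tag. A sketch:

```
# The bare tensorflow/tensorflow image doesn't run a notebook server;
# the :latest-jupyter tag is the variant that actually listens on 8888.
docker pull tensorflow/tensorflow:latest-jupyter
docker run -it --rm -p 8888:8888 tensorflow/tensorflow:latest-jupyter
# Then open the http://127.0.0.1:8888/?token=... URL printed in the container logs.
```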

So I decide maybe I'm too much of an outlier, running my Arch Linux.

I go over to a Windows machine. I install Python with their web installer. I go to type python after it's done. Command not found. Somehow, miraculously, I figure out that it installed as py.
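
That seems to be the python.org installer's doing: by default it registers the `py` launcher, while adding python.exe to PATH is an opt-in checkbox. A quick sanity check in a Windows terminal, assuming a default install:

```
# The launcher gets registered even when python.exe isn't on PATH
py --version     # runs the default interpreter
py -3 --version  # explicitly asks for Python 3
py -c "import sys; print(sys.executable)"  # shows where it actually landed
```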

Ok. Next step. Let's install TensorFlow. `pip install tensorflow`. I think about how stupid Windows is to not have a superuser as I type. I mean, I install Python and now I can install anything with one command. pip not found. I go to the internet for instructions on how to install pip. `pip install pip` is the first instruction. It's also the only instruction. I try to dig deeper. It says, smugly, that for my version of Python, and any version newer than bread, pip is already installed on my machine.

Ok, well what if it isn't? Oh, there's a page for that. Doesn't work.
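
In my experience the catch is that pip usually does ship with the installer, but a bare `pip` command isn't necessarily on PATH; going through the launcher with `-m` sidesteps that, and `ensurepip` or pypa's get-pip.py are the documented fallbacks if it genuinely is missing. A sketch:

```
# pip is normally bundled, just not always reachable as a bare "pip" command
py -m pip --version

# If it really is missing, the bundled bootstrapper can (re)install it...
py -m ensurepip --upgrade

# ...or use the standalone script from pypa:
# curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py
# py get-pip.py

# Then install TensorFlow through the same route:
py -m pip install tensorflow
```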


Ok. Maybe I'm an idiot and can't figure out other people's shit. But in the past we had 0% of these problems. Three quarters of the code was written in C, which is supposed to be the most non-portable. But it ran in 100% of cases on every distro and every architecture: x86, x86-64, ARM, MIPS, even the BSDs, without a second of doubt that it would work. All with the same repetitive commands. Yeah, that C non-portability sure was a curse.

Now you can't install anything without instructions and it never works. Why is it that doing it the hard way was easy in the past and now we have a million easy ways to install software and they are all hard?


(post is archived)

[–] 0 pt

Right. Those people are inept. The problem is the people I deal with are also inept, but if I bother to tell them they packaged their software poorly, I end up looking like the dummy when they say, "Don't you know how to use yarn?"

The whole point of packages is that people don't need to know your shit.

You would think that with packaging things for C, a lot of the complexity of the problem would bleed through to the user, but it doesn't. Yet now we have high-level languages that let you write in 10 lines what would take thousands of lines in C. Which is really great. But then there's even less excuse to pass complexity on to the user. They are taking the most uncomplicated thing and wrapping it in complexity.