
©2025 Poal.co


This is a bit of a tech question.

I've been doing the Linux thing for the last 10 years. When I started, there were about three different ways to install anything, all of which worked every time.

You could run {package-manager} {install cmd or flags} {package} and install almost anything. You could download it from the project's website. Or you could get the source and run ./configure; make; make install. And it would always work.

Using make was considered the hard way, but it was just three commands, universal to every project. Even non-C projects would mirror it as a practice. It worked 100% of the time no matter what device you were on.
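To make the "three universal commands" point concrete, here is a toy stand-in for that ritual. The project name and files are made up for illustration, and a real project would generate its Makefile via `./configure` rather than `printf`, but the build/install steps are the same shape on any distro:

```shell
# Toy project: one C file, one hand-written Makefile (stand-in for ./configure).
mkdir -p /tmp/hello-demo/prefix && cd /tmp/hello-demo
printf '#include <stdio.h>\nint main(void){puts("hello");return 0;}\n' > hello.c
printf 'hello: hello.c\n\tcc -o hello hello.c\ninstall: hello\n\tinstall -m 755 hello $(PREFIX)/hello\n' > Makefile
make                                          # build: identical everywhere
make install PREFIX=/tmp/hello-demo/prefix    # install into a prefix
/tmp/hello-demo/prefix/hello                  # prints: hello
```

The whole point is that nothing above is project-specific knowledge: the same two targets worked for everything.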

Now you want to install TensorFlow or something else.

You type pip install tensorflow. It installs. You launch python and run import tensorflow as tf. "Illegal instruction."
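For what it's worth, "Illegal instruction" on import is usually a CPU-feature mismatch rather than a broken install: the official TensorFlow wheels have been built assuming AVX since TF 1.6, so a CPU without it crashes the moment the shared library loads. A quick Linux-specific check against the kernel's reported CPU flags:

```shell
# Look for the avx flag in /proc/cpuinfo; its absence explains the SIGILL.
if grep -m1 -q 'avx' /proc/cpuinfo 2>/dev/null; then
  echo "CPU reports AVX"
else
  echo "no AVX flag: prebuilt TensorFlow wheels may crash on import"
fi
```

If the flag is missing, the options are an older wheel, a community no-AVX build, or building from source.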

Ok, so you try the next option: Docker. Two commands listed. Now keep in mind, I've seen Docker completely fail to install on a device. You might say it was me, but I saw a 100% install rate for all late-2000s software, without fail. I've also seen other people mention running Docker inside VirtualBox on their Macs before. So this is clearly a thing.

[Downfall: Hitler uses docker](https://tube.poal.co/watch?v=PivpCKEiQOQ)

So I do the pull and it goes well, which is surprising. I then run docker -it -p8888:8888 tensorflow/tensorflow to get the Jupyter notebook. I go to http://localhost:8888 in my browser. Nothing.
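The command as typed is likely the problem: the Docker CLI wants the `run` subcommand before any flags, and (assuming the image tags as currently published) the notebook server ships in the `-jupyter` variant of the image. A hedged sketch of the intended invocation:

```shell
# Assumes Docker is installed and running. `docker -it ...` without the
# `run` subcommand is rejected by the CLI outright.
docker run -it --rm -p 8888:8888 tensorflow/tensorflow:latest-jupyter
# Then open http://localhost:8888 and paste the token printed in the log.
```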

So I decide maybe I'm too different running my Arch Linux.

I go onto a Windows machine. I install Python with their web installer. I go to type python after it's done. Command not found. Somehow, miraculously, I figure out that it installed as py.

Ok. Next step. Let's install TensorFlow: pip install tensorflow. I think about how stupid Windows is to not have a superuser as I type. I mean, I installed Python and now I can install anything with one command. pip not found. I go to the internet for instructions on how to install pip. pip install pip is the first instruction. It's also the only instruction. I try to dig deeper. It says, smugly, that for my version of Python and any version newer than bread, pip is already installed on my machine.

Ok, well what if it isn't? Oh, there's a page for that. Doesn't work.
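For the record, the step the instructions tend to leave out: on Windows the installer registers the `py` launcher instead of putting `python` on PATH (unless the "Add python.exe to PATH" box was ticked), and pip is meant to be invoked through it. A sketch of what presumably should have worked (typed in a Windows console; shown with shell-style comments for clarity):

```shell
py -m ensurepip --upgrade      # bootstrap pip if the installer skipped it
py -m pip install --upgrade pip
py -m pip install tensorflow   # no superuser equivalent required
```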


Ok. Maybe I'm an idiot and can't figure out other people's shit. But in the past we had 0% of these problems. Three-quarters of the code was written in C, which is supposed to be the most non-portable. But it ran in 100% of cases on all distros and architectures, including x86, 64-bit, ARM, MIPS, and BSD, without a second of doubt that it wouldn't work. All with the same repetitive commands. Yeah, that C non-portability sure was a curse.

Now you can't install anything without instructions and it never works. Why is it that doing it the hard way was easy in the past and now we have a million easy ways to install software and they are all hard?


(post is archived)

[–] 0 pt

I think all the Pajeets we've imported are affecting the finished products, after decades of getting promoted and now hiring their tribe over the best-skilled for the job.

You're not the only one who has experienced this.

My experience has generally been that shit almost never installs out of the box.

There are always some dumbshit extra steps I have to follow to get it to work.

And don't get me started on Docker. I don't get everyone's collective orgasm over Docker; I've found it obnoxious to work with. Plus, Docker allows companies to sell proprietary containers, which basically means Docker will turn into a paid platform if you want to do anything other than use purely open-source code.

I think the problem is that technology is getting more and more specialized. The distros are moving farther apart from each other in terms of related packages and software versions.

There are a lot more competing installation methods out there. I think Go has the best idea, shipping (basically static) binaries.
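The Go point, for the curious: one environment variable makes the toolchain emit a fully static binary you can copy to any matching machine. A sketch assuming a Go toolchain is installed; `myapp` is a made-up name:

```shell
# CGO_ENABLED=0 forces a pure-Go, statically linked build, so the output
# runs with no runtime dependencies; GOOS/GOARCH cross-compile the target.
CGO_ENABLED=0 GOOS=linux GOARCH=arm64 go build -o myapp .
```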

This is definitely better than it used to be: scattered libraries all over the place to find and compile, eventually giving way to packages. While sometimes these things are broken, you can usually open an issue and get help pretty quickly. You do have to learn a little about a lot, but I think the ecosystem is evolving really well. A few steps forward, with one back every so often.

[–] 0 pt

I found that the scattered libraries always installed consistently. So yeah, it took a long time if several libraries weren't in the package manager, but you always got the end product.

I tried to install something yesterday (can't remember what now), but it wouldn't install. The makefile wouldn't work; I did a search and found that the idiots on the site that created the files didn't set the build up right. I think the problem you are having is that people are not competent like in the past.

I was installing themes for XP and found a great one. The dumbass had great art but forgot to make a theme file, so the theme was unusable. I was fuming that something so necessary for any theme, even in Windows, was missing. In Windows, though, I was good enough to take another theme file and edit it for the theme I needed, and it always worked, but I don't have enough experience in Linux to do that YET. Always learning though, so I'll be great in 10 years, I imagine.

[–] 0 pt

Right. Those people are inept. The problem is that the people I deal with are also inept, but if I bother to tell them they packaged their software poorly, I end up looking like the dummy when they say, "Don't you know how to use yarn?"

The whole point of packages is that people don't need to know your shit.

You would think that, packaging things for C, a lot of the complexity of the problem would bleed through to the user, but it doesn't. Yet now we have high-level languages that let you write in 10 lines what would take thousands of lines in C. Which is really great. But then there's even less excuse to pass complexity on to the user. They are taking the most uncomplicated thing and wrapping it in complexity.

[–] 0 pt

So do me a favor, guys. The next project you work on, add a Makefile.

Sorry, I sent you something and it did not have a Makefile. It should have, though.
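In that spirit, a minimal Makefile that even a non-C project can ship, mirroring the old ritual. Target names and paths here are illustrative, not from any real project:

```make
PREFIX ?= /usr/local
.PHONY: all install clean

all:
	@echo "nothing to compile for a script-only project"

install: all
	install -m 755 myscript.sh $(PREFIX)/bin/myscript

clean:
	rm -f *.tmp
```

Even when there is nothing to build, `make` and `make install` keep working as the universal interface.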

[–] 1 pt

Lol

[–] 0 pt

The best irony of all is that I was the one who convinced phuks to write a Dockerfile. I wrote the first one. I just thought Dockerfiles document how to install something, so they are useful even if you don't use Docker.

I almost made your code docker dependent.
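That instinct is defensible: even unused, a Dockerfile is an executable install document. A sketch of the shape (the base image and package names are illustrative, not Poal's actual stack):

```dockerfile
FROM python:3.11-slim                  # pin the interpreter version
RUN apt-get update \
 && apt-get install -y --no-install-recommends git \
 && rm -rf /var/lib/apt/lists/*       # system deps, spelled out explicitly
COPY requirements.txt /app/
WORKDIR /app
RUN pip install --no-cache-dir -r requirements.txt   # python deps
COPY . /app
CMD ["python", "app.py"]               # how the service actually starts
```

Each RUN line doubles as a line of install instructions a human can follow by hand.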

[–] 0 pt

They still don't use it, and it's still shit. Too many changes on a regular basis for it to work well. I just set up another clone for someone two days ago and there was no really good documentation to make it work. There still isn't; I'll just have to write it myself again.

[–] 0 pt

I've got an instance of Poal running on a Pi2.

One of the best install moments of my life.

[–] 0 pt

I'm sure docker would have made it easier. Good thing Poal doesn't have docker as a core dependency. That would make it less portable.