
©2025 Poal.co

While browsing, I noticed links to articles on garbage-tier [msm] websites that I have no interest in clicking but wanted to read. Posters on other platforms are encouraged to archive links, which preserves the original content of the link and prevents the original source from getting click revenue. I'd like to see that here. Thoughts?

(post is archived)

[–] 4 pts

Agree, but people are going to have different opinions about which sites are worthy of the clicks, and some people just aren't going to bother archiving their links.

If you're interested enough to read the article, you can archive it yourself easily enough by copying the link and pasting it into archive.is.
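For anyone who'd rather script that copy-and-paste step, here's a minimal Python sketch. It assumes the archive.today family of mirrors (archive.is / archive.ph) keeps supporting its plain `/newest/<url>` lookup path and homepage submit form; that endpoint layout is an assumption, and the actual HTTP request is left to the reader since the service rate-limits automated traffic.

```python
from urllib.parse import quote

# Assumption: archive.ph is one of the archive.today mirrors (same layout as archive.is)
ARCHIVE_BASE = "https://archive.ph"

def newest_snapshot_url(article_url: str) -> str:
    """Build the archive.today 'newest snapshot' lookup URL for an article."""
    return f"{ARCHIVE_BASE}/newest/{article_url}"

def submit_url(article_url: str) -> str:
    """Build the form target used when you paste a link on the archive.today homepage."""
    return f"{ARCHIVE_BASE}/submit/?url={quote(article_url, safe='')}"

if __name__ == "__main__":
    link = "https://example.com/some-msm-article"
    print(newest_snapshot_url(link))
    print(submit_url(link))
```

Pasting the `newest_snapshot_url` result into a browser is the same as searching the link on archive.is by hand.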

Maybe someone smart and willing to dedicate their time and processing power could make a derram-style bot. But I just ruled myself out on three counts, so it ain't gonna be me.

[–] [deleted] 4 pts

Thanks for your thoughts. I agree that compiling a list of websites worthy/not worthy of clicks may be difficult and there are those who just won't archive anyway. I will archive links as you suggested for my personal review. A derram-style bot would be a great addition. Hopefully someone with the skills can throw one together.

[–] [deleted] 4 pts

Could we possibly add something where users can define "evil domains" in their user accounts so a given submission would get a warning mark by it (kind of like how it's done with non-https links) so it's easier to not click on such links?
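A minimal sketch of how that per-user check might work, assuming each account stores its "evil domains" as a plain set and submissions keep their original URL (all names here are hypothetical, not Poal's actual code):

```python
from urllib.parse import urlparse

def is_evil_domain(submission_url: str, evil_domains: set[str]) -> bool:
    """Return True if the submission's host matches a user-defined evil domain,
    including subdomains (www.cnn.com matches cnn.com)."""
    host = (urlparse(submission_url).hostname or "").lower()
    return any(host == d or host.endswith("." + d) for d in evil_domains)

def warning_mark(submission_url: str, evil_domains: set[str]) -> str:
    """Prefix a warning glyph, much like the existing non-https marker."""
    return "⚠ " if is_evil_domain(submission_url, evil_domains) else ""
```

The template rendering a submission's title would just call `warning_mark` with the viewing user's list.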

[–] 2 pts

Good idea! Please submit it to

[–] [deleted] 2 pts

That's an interesting idea. To each his/her own list. Thanks!

[–] 3 pts

We'd need a bot account to auto-archive every post…

It'd be easier to implement if Poal had an API, which I don't believe it does at the moment.

[–] 3 pts

I generally archive sites that are left-leaning cesspools but contain an interesting article, or sites that are so ad-happy they're hard to read otherwise. Bear in mind, though, that archive doesn't work with videos, so if an article has a decent video that should be seen, using archive isn't the answer.

[–] 3 pts

Good point about the video links.

[–] 2 pts

Example: I generally archive livescience.com because it has pop-up videos, plus people have complained the ads on it drive them nuts on smartphones. But if the main theme is a video, I can't, so I just run with video links and slideshows.

[–] [deleted] 2 pts

Thanks for archiving and your points on videos embedded in articles.

[–] 2 pts

Never realized that. Thanks for the post

[–] 1 pt

A good feature for Poal, one that would help it stand out, would be an auto-archive function that runs in the background in parallel with every submission. A small icon would link to the "archived" version of the post, while clicking the headline would open the actual link.
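One way that background step could be wired up, sketched with a worker thread; `archive_fn` stands in for whatever archiver the site would actually call, and every name here is illustrative rather than anything Poal ships:

```python
import queue
import threading

def start_archiver(archive_fn, results: dict) -> queue.Queue:
    """Start a background worker that archives submitted URLs as they arrive.
    archive_fn(url) -> archived_url; results maps original -> archived link,
    which the UI could expose behind a small 'archived' icon."""
    jobs: queue.Queue = queue.Queue()

    def worker():
        while True:
            url = jobs.get()
            if url is None:  # shutdown sentinel
                break
            try:
                results[url] = archive_fn(url)
            except Exception:
                pass  # one failed archive shouldn't kill the worker
            finally:
                jobs.task_done()

    threading.Thread(target=worker, daemon=True).start()
    return jobs
```

The submission handler would just `jobs.put(url)` and return immediately, so archiving never delays the poster.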