Here is an interesting thought: does this mean that you could probably create things like this with the right model? As in, models that have been trained on code that is hidden behind NDAs and such?

Archive: https://archive.today/nyv2V

From the post:

>My old 2016 MacBook Pro has been collecting dust in a cabinet for some time now. The laptop suffers from a “flexgate” problem, and I don’t have any practical use for it. For quite some time, I’ve been thinking about repurposing it as a guinea pig, to play with FreeBSD — an OS that I’d aspired to play with for a long while, but had never had a real reason to. During the recent holiday season, right after FreeBSD 15 release, I’ve finally found time to set the laptop up. Doing that I didn’t plan, or even think, this may turn into a story about AI coding.


> Does this mean that you could probably create things like this with the right model? As in, models that have been trained on code that is hidden behind NDAs and such?

None of the publicly available LLMs was trained on proprietary source code. Even Microsoft's own Copilot was not trained on its vast collection of internal, working, production code. The vendors know full well that these models will leak their training data; that is, after all, what they are designed to do.
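
For what it's worth, this kind of leakage is empirically testable: the standard probe is to prompt a model with the opening of a file you suspect is in its training set and check whether it reproduces the rest verbatim. Here is a minimal sketch, assuming a local HuggingFace-style causal LM; the model name and the prefix/suffix snippet are placeholders, not anything known to be memorized:

```python
# Hypothetical memorization probe: prompt with a known prefix, then check
# whether the model regurgitates the expected continuation verbatim.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL = "gpt2"  # stand-in; substitute whatever causal LM you want to probe

tok = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForCausalLM.from_pretrained(MODEL)

# Prefix/suffix pair from a file suspected to be in the training set
# (both are illustrative placeholders here).
prefix = "def quicksort(arr):\n    if len(arr) <= 1:\n        return arr\n"
known_suffix = "    pivot = arr[0]"

inputs = tok(prefix, return_tensors="pt")
out = model.generate(
    **inputs,
    max_new_tokens=32,
    do_sample=False,  # greedy decoding makes verbatim regurgitation easiest to spot
)
# Decode only the newly generated tokens, not the prompt.
completion = tok.decode(out[0][inputs["input_ids"].shape[1]:])

# A verbatim match on a long, distinctive suffix is evidence of memorization.
print("verbatim match:", completion.lstrip("\n").startswith(known_suffix))
```

If a model completes long, distinctive stretches of NDA'd code this way, that code was in its training set; the vendors declining to train on their own internal repositories reads as a tacit admission that the probe works.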