The Monroe Doctrine was about preventing colonial powers from making NEW efforts to reach into the Americas, not about undoing their existing control.
"The occasion has been judged proper for asserting, as a principle in which the rights and interests of the United States are involved, that the American continents, by the free and independent condition which they have assumed and maintain, are henceforth not to be considered as subjects FOR FUTURE COLONIZATION by any European powers." (emphasis mine)
Yeah, you can visit the EU by… sailing a ways Northeast(ish) from Maine, until you’re just south of (a part of) Canada. And by going to the Caribbean. And South America.
I've got a pure Go journald file writer that works to some extent: it doesn't split, compress, etc., but it produces journal files that journalctl/sdjournal can read, concurrently. It's only been stress tested by running a bunch of parallel integration tests, I will most likely not maintain it seriously, it's total newbie garbage, etc., but it may be of interest to someone. I haven't really seen any other working journald file writers.
Is it solved in macOS? Curl recently removed macOS keychain support because there are something like 7 competing APIs, 6 of which are deprecated, and the remaining one is a complete HTTP replacement, so curl can't use it.
The only reason it works with the curl shipped on macOS is that it's a few versions behind.
Complains about TLS inspection, yet fronts their website with the biggest and most widely deployed TLS introspection middlebox in the world ...
Why do we all disdain local TLS inspection software, yet half the Internet terminates their TLS connections at Cloudflare, who are most likely giving direct access to US intelligence?
It's so much worse, as it infringes on the privacy and security of billions of innocent people, whilst inspection software only hurts some annoying enterprise folks.
I wish we all hopped off the Cloudflare bandwagon.
Three of the banks I use have their websites/apps go through Cloudflare. So does the electronic records and messaging system used by my doctor. A lawyer friend uses a secure document transfer service that is protected by guess who.
Who needs to let CF directly onto their network when they already sit between client and provider for critically-private, privileged communications and records access?
> Putting Cloudflare anti-DDoS in front of your website is not the same as breaking all encryption on your internal networks.
You misunderstood, they're complaining about it as a user. If your website uses Cloudflare then our conversation gets terminated by Cloudflare, so they get to see our unencrypted traffic and share it with whomever they want, compromising my privacy.
Which wouldn't be such a problem if it were just the odd website here or there, but Cloudflare is now essentially a TLS middlebox for the entire internet, with most of the problems that the article complains about, even as the article itself is hosted behind Cloudflare.
Given that 50-70% of the critical services I use in my daily life (healthcare, government, banking, insurance) go through Cloudflare, this practically means everything that is important to me as an individual is being actively intercepted by a US entity that falls under the NSA's control.
So for all intents and purposes it's equivalent.
My point is: it's very hypocritical that we as industry professionals complain about poor corporates being MITM'd whilst we're perfectly fine enabling the infringement of the fundamental human right to privacy of billions of people by fronting all the shit that we build with Cloudflare in the name of "security".
Personally, I find the lack of an ethical compass in this regard very disturbing.
Having an organization install custom root certificates onto your work or personal computer and hosting a public blog on Cloudflare are two entirely different topics.
That your healthcare, government, bank, etc. are using Cloudflare is a third. In an ideal world I guess I'd agree with you, but asking any of these institutions to deploy proper DDoS protection themselves may just be too much of an ask.
Do you have an alternative, ideally one that's less centralised, more private, or less in bed with three-letter agencies? I ask because my last infra was probed for vulnerabilities hundreds of times per day; putting Cloudflare in front, with some blocked countries and their captchas, brought the attempted attacks down to a few dozen per month.
I think it’s misleading to imply hypocrisy considering the reasons listed in the article don’t apply to the scenario of a site being behind Cloudflare.
Tailscale connections don't get terminated by a middlebox; it's just end-to-end encrypted WireGuard under the hood. The cloud-hosted control plane is a risk because they could push malicious configuration changes to your clients (ACLs, and new nodes if you're not using the lock feature), but they can't do that without leaving a trace, the way Cloudflare can.
The previous version was in bash. With this change you can build a NixOS image that doesn't contain bash, or any shell whatsoever.
Not having interpreted languages on the system at all is an effective hardening technique when combined with a verity-protected store containing all your executables: it makes it impossible for attackers to add new executable files to the system, which stops almost all attack vectors.
I'm glad to see boot security prioritised, some of the fundamentals revisited, and scripts replaced with languages that contributors want to write in (NixOS leans heavily towards Rust).
As the project doc notes:
> This radical solution is only really feasible and/or interesting for appliances (i.e. non-interactive) systems.
Can you explain a bit more about this? Is the idea that verity protects the integrity of the Nix store, and that the boot process then only runs binaries that don't expose any sort of arbitrary-code functionality?
> makes it impossible for attackers to add new executable files to the system which stops almost all attack vectors
If you have code execution - any kind - you have code execution. It really doesn't matter whether a shell is available or not: you're always an open(2), write(2), and execve(2) away from creating and invoking a new executable, or just an mmap(2) away from mapping a new executable region in the current process. Yes, most exploits leverage a shell because it's convenient, so you're making it a little more annoying by forcing the attacker to first write an executable, but it really doesn't stop attacks like this.
Much more effective measures are those that prevent program takeover in the first place (SSP, ASLR), and things like W^X.
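To make that concrete, here's a minimal C sketch (elf_bytes/elf_len are hypothetical stand-ins for whatever payload the attacker's existing code execution already delivered); the memfd_create(2)/fexecve(3) pair is a variant of the open/write/execve route that doesn't even need a writable filesystem:

```c
#define _GNU_SOURCE
#include <stddef.h>     /* size_t */
#include <sys/mman.h>   /* memfd_create */
#include <unistd.h>     /* write, fexecve */

/* elf_bytes/elf_len are hypothetical: a complete ELF image that the
 * attacker's existing code execution has already placed in memory. */
static void run_payload(const unsigned char *elf_bytes, size_t elf_len)
{
    /* An anonymous in-memory file: nothing is ever written to the
     * (possibly read-only, verity-protected) filesystem. */
    int fd = memfd_create("payload", MFD_CLOEXEC);
    if (fd < 0)
        return;
    if (write(fd, elf_bytes, elf_len) != (ssize_t)elf_len)
        return;

    char *const argv[] = { "payload", NULL };
    char *const envp[] = { NULL };
    fexecve(fd, argv, envp); /* replaces the current process image */
}
```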
It's just incomplete; it's very early days for Landlock.
Landlock requires you to commit upfront to what is denied by default, but so far they've only added a control for TCP socket bind and nothing else. So you can default-deny TCP bind, but all the other socket paths in the kernel are not guarded by Landlock. It tries really hard to make feature opt-in an integral part of the Landlock API, so that an application can run on multiple kernel versions that support different parts of the Landlock spec. But that means that, as the API develops, older versions of Landlock have to be less restrictive than newer versions, otherwise programs don't work across kernel versions.
That way, a program that is very restrictive on, say, kernel 6.30 can also run on kernel 6.1 with fewer restrictions. The program keeps functioning the same way (never break userspace). The only way to do that is to have the developer explicitly declare which parts need to be restricted, and you can't restrict what isn't implemented yet.
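Roughly, that best-effort pattern looks like this in C (a minimal sketch, assuming Linux 6.7+ kernel headers for the ABI 4 definitions; the access flags are just examples, and a real program would carve out exceptions with landlock_add_rule(2) for the paths it actually needs):

```c
#define _GNU_SOURCE
#include <linux/landlock.h>
#include <sys/prctl.h>
#include <sys/syscall.h>
#include <unistd.h>

int main(void)
{
    /* Everything listed here becomes deny-by-default once applied. */
    struct landlock_ruleset_attr attr = {
        .handled_access_fs  = LANDLOCK_ACCESS_FS_EXECUTE |
                              LANDLOCK_ACCESS_FS_WRITE_FILE,
        .handled_access_net = LANDLOCK_ACCESS_NET_BIND_TCP,
    };

    /* Ask the kernel which Landlock ABI it supports (libc has no wrappers). */
    long abi = syscall(SYS_landlock_create_ruleset, NULL, 0,
                       LANDLOCK_CREATE_RULESET_VERSION);
    if (abi < 1)
        return 0; /* no Landlock at all: run unrestricted, don't break */
    if (abi < 4)
        attr.handled_access_net = 0; /* TCP bind control only exists in ABI >= 4 */

    int ruleset_fd = (int)syscall(SYS_landlock_create_ruleset,
                                  &attr, sizeof(attr), 0);
    if (ruleset_fd < 0)
        return 1;

    /* Once applied, these restrictions can never be lifted again. */
    prctl(PR_SET_NO_NEW_PRIVS, 1, 0, 0, 0);
    syscall(SYS_landlock_restrict_self, ruleset_fd, 0);
    close(ruleset_fd);

    /* ... the rest of the program runs with whatever subset the
     * running kernel was able to enforce ... */
    return 0;
}
```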
There is nothing to do here. Landlock already guarantees that you can't undo rules that were already applied. Your application can further restrict itself, but it can't unrestrict itself.
I studied at Utrecht University, and all the programming classes in the Bachelor's were C#, Visual Studio, XNA, DirectX. Windows. The database class I had to learn with proprietary Microsoft tools too. All Microsoft stuff. Sure, nobody would complain if you did stuff on Linux, but all the support from TAs and teachers was on Microsoft platforms only. The Master's was much better, but the Bachelor's was basically grooming people to become Microsoft consultants.
If the rot starts at the core of your education curriculum, there is no escaping your dependence on Microsoft.
I always found it a puzzling choice to teach people proprietary technologies in a public institution. This was before .NET Core and VS Code were a thing, and Microsoft hadn't whitewashed themselves to look like an open-source-friendly brand yet.
And the same goes for less technical disciplines too: Adobe, Autodesk, Archicad, etc. It's pretty bad software: expensive, very buggy, poorly extensible, poorly maintained, closed-source, and its rapid tech-debt accumulation requires upgrading your PC every few years. If only a minor percentage of the organizations licensing it would instead spend that budget financing an open-source project, it would have a very positive effect for everyone. I can somewhat understand private businesses not thinking long-term, but public institutions paying licensing fees instead of financing open source seems like plain incompetence. Then again, maybe there's a lack of open-source initiatives willing to spearhead this.
But if students learn some open-source software that doesn't get used in private industry, will they be able to land a job that lists Autodesk et al. knowledge as a requirement?
As a former medical and scientific illustrator, learning a software package (Photoshop or GIMP) really isn't as crucial as learning principles and practices of art, design, and graphics. Color theory, negative space, composition, etc., are critical to production and apply to any media one chooses to work in: oil on canvas, pen and ink, or computers.
The other issue is access. Again, from an art/graphics/design perspective, the costs associated with proprietary software can keep some students from even participating in art/graphics/design programs. Adobe Creative Suite is US$69.99/mo or US$840/yr.
It's not the job of a university to prepare you for the workplace. That's the job of the workplace. I'm sick of industry outsourcing their jobs to public institutions.
It's the job of a university to teach cutting edge research.
Sure, you can say that. But a good chunk of people will disagree with you. I went to one of the top schools, and the split was roughly 30/70 between teaching "cutting edge research" and teaching "what's being used in practice". I think that was fair. During a bachelor's you'll hardly get cutting-edge research, because you don't have the prerequisite knowledge.
Agree with this 100%. At some point the private sector decided that it would accept no responsibilities of any kind (except those fought for and defended tooth and nail by civil society and a few slightly more responsible governments), and that all the costs that could be avoided would be avoided, shifting the burden onto the public sector.
It's not a big jump to go from the open-source equivalents to the closed-source products. The concepts and what one wants to accomplish are the same. Many of these companies have certification programs, if the point is to be specific and narrow for a particular job.
Good points. Funding the open-source equivalents, even at a fraction of what they are spending on closed source, would have circumvented the problem of being "trapped" in the first place. What's more, the universities would be able to contribute code to the projects if they wanted to.
It was always pretty obvious what Microsoft wanted and was trying to do. Now, trying to escape will be painful, but that's the price they will have to pay if they want freedom and data sovereignty.
I don't think it's a stretch. I have numerous close friends who work with it daily, and I've helped troubleshoot some of the issues. After Effects is quite hated among them, but it has to be used because there are no viable alternatives. Illustrator crashes randomly. Photoshop has multi-decade bugs in color handling. But the fact that their resource use balloons yearly, and thus forces the industry to waste money on constant hardware upgrades, would be enough to discredit Adobe software, imo.
Well, I guess at the time a large part of GHC development technically was Microsoft Research ;) . But yeah, the Functional Programming and Compilers courses were nice exceptions to the Microsoft trend. That's also why I ended up following that path in my master's programme :')
Visual Studio Code sucks badly; it's just that most developers started with it and are used to it, in the same way that Windows was "the OS" for the same kind of developers at a certain point in the past.
It's even worse now that VS Code and all its clones are packed with LLM agents that such devs can't live without.
For one example, the latency of the editor is crazy to someone who has worked with native editors.
It did slowly sneak in over time, I guess. In the last year of my master's, the faculty was eventually forced to stop hosting its own intranet and mailing lists and migrate everything to the "cloud" (Microsoft 365 and Blackboard).
I have a copy at home of all the old wiki content and the old cs.uu.nl website. The university themselves didn't even think they should archive it, so I archived it myself.
I hope there are other people with copies too; my archive isn't complete.
> If the rot starts at the core of your education curriculum, there is no escaping your dependence on Microsoft.
TBF, the curriculum being MS-based can mean very little if the concepts taught in it are valuable enough. I've briefly looked at the projects linked in your user description, and they look nice, and absolutely not tainted by MS influence.
It is indeed dancing with the devil, but if MS forks over the money to renew the whole university's computer fleet, clear all the licensing issues, and train part of the staff, it can be a boon for the university.
My uni had a deal with Sun (RIP): many basic courses were in Java, our systems programming courses were all Solaris-targeted, and all the servers were Solaris anyway, so our code had to run there. It's a pretty basic arrangement IMHO.