If you have overcommit on, that happens. But if you have it off, the kernel has to assume the worst case, otherwise there could be a failure later, when someone actually writes to a page.
The fundamental problem is that your machine is running software from a thousand different projects or libraries just to provide the basic system, and most of them do not handle allocation failure gracefully. If program A allocates too much memory and overcommit is off, that doesn't necessarily mean that A gets an allocation failure. It might also mean that code in library B in background process C gets the failure, and fails in a way that puts the system in a state that's not easily recoverable, and is possibly very different every time it happens.
For cleanly surfacing errors, overcommit=2 is a bad choice. For most servers, it's much better to leave overcommit on, but make the OOM killer always target your primary service/container, using oom-score-adj, and/or memory.oom.group to take out the whole cgroup. This way, you get to cleanly combine your OOM condition handling with the general failure case and can restart everything from a known foundation, instead of trying to soldier on while possibly lacking some piece of support infrastructure that is necessary but usually invisible.
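Here's roughly what that looks like in practice. This is only a sketch: my_service, the PID lookup, and the cgroup path are placeholders, and memory.oom.group assumes cgroup v2.

# Make the primary service the preferred OOM victim (1000 = always picked first)
echo 1000 > /proc/$(pidof my_service)/oom_score_adj

# Have the OOM killer take out the service's whole cgroup instead of a single task (cgroup v2)
echo 1 > /sys/fs/cgroup/system.slice/my_service.service/memory.oom.group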
There's also cgroup resource controls to separately govern max memory and swap usage. Thanks to systemd and systemd-run, you can easily apply and adjust them on arbitrary processes. The manpages you want are systemd.resource-control and systemd.exec. I haven't found any other equivalent tools that expose these cgroup features to the extent that systemd does.
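For example (a sketch; ./my_server and my_server.service are placeholder names, and MemoryMax/MemorySwapMax are the cgroup v2 properties documented in systemd.resource-control):

# Start a process in a transient scope with hard memory and swap caps
systemd-run --scope -p MemoryMax=2G -p MemorySwapMax=0 ./my_server

# Or adjust an already-running unit on the fly
systemctl set-property my_server.service MemoryMax=2G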
I really dislike systemd and its monolithic mass of over-engineered, all-encompassing code. So I have to hang a comment here showing just how easy this is to manage in a simple startup script, since the kernel always exposes these cgroup features directly.
Taken from a SO post:
# Create a cgroup
mkdir /sys/fs/cgroup/memory/my_cgroup
# Add the process to it
echo $PID > /sys/fs/cgroup/memory/my_cgroup/cgroup.procs
# Set the limit to 40MB
echo $((40 * 1024 * 1024)) > /sys/fs/cgroup/memory/my_cgroup/memory.limit_in_bytes
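That snippet uses the old cgroup v1 memory controller. On a distribution using the unified cgroup v2 hierarchy the equivalent is just as simple; this sketch assumes cgroup2 is mounted at /sys/fs/cgroup and the memory controller is enabled for the parent cgroup.

# Create a cgroup under the unified hierarchy
mkdir /sys/fs/cgroup/my_cgroup
# Add the process to it
echo $PID > /sys/fs/cgroup/my_cgroup/cgroup.procs
# Set the hard limit to 40MB
echo $((40 * 1024 * 1024)) > /sys/fs/cgroup/my_cgroup/memory.max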
Linux is so beautiful. Unix is. Systemd is like a person with makeup plastered 1" thick all over their face. It detracts, obscures the natural beauty, and is just a lot of work for no reason.
This is a better explanation and fix than others I've seen. There will be differences between desktop and server uses, but misbehaving applications and libraries exist on both.
There are two separate ongoing projects to make a Rust compiler that uses GCC as a backend: one on the GCC side adding a frontend, written in C++, that compiles Rust directly (gccrs), and one on the rustc side making rustc emit an intermediate representation that GCC can ingest (rustc_codegen_gcc).
The long-term solution is for either of those to mature to the point where there is Rust support for every architecture that GCC supports.
I wonder how good an LLVM backend for these rare architectures would have to be for that to be “good enough” for the kernel team. Obviously correctness should be non-negotiable, but how important is it that the code generated for, e.g., Alpha is performant for somebody’s hobby?
Please explain how it adds to the discussion about different ways to broaden supported Rust target architectures. Because both have the word Rust in them?
100% agreed. The golden age of gaming is right now, Kickstarter and Steam have opened up the field to smaller studios in a way that has never happened before.
The biggest, most advertised titles are often very good-looking and very "bubblegum", for the exact same reason that the most popular genres of pop are like they are. To appeal to the widest audience, you have to file off all the sharp corners, and if that's the market you see then modern games can seem soulless.
But that's not all of the market! No matter what genre you are interested in, there's probably more work ongoing in it and better games coming out right now than there ever have been in history. Most of them are less refined and sell a lot less than the mainstream games, but occasionally one succeeds well enough to expand past the small niche audience, which inevitably brings a lot more people into the niche, followed by imitators which grow the niche.
I feel like the indie-games are almost as clustered in small areas of potential "game design space" as AAA-games are, but just clustered in different areas, in particular around "games inspired by a handful of SNES games and early Playstation JRPGs" (and maybe a tiny amount of vague Rogue-like-likeness). If you read much about old games (e.g. [1]) it is obvious that the history of games is full of evolutionary dead-ends and forgotten mainstream games (and entire almost-forgotten mainstream genres).
Yeah, it's hard not to consider the runaway success of games like Stardew Valley a counterexample to the idea that the creativity is completely gone. You couldn't blame someone for superficially looking at screenshots and thinking it was a run-of-the-mill retro pixel game. But it's wild to me that there are people who come from broken homes or rough childhoods who say the game was literally therapy for them and showed them a vision of domestic life or human interaction that they could realistically replicate, or at least shoot for, in real life.
I'm currently playing a game that is a blatant rip-off of Stardew Valley, to the point where I frequently question why they were so obvious about it. (Or maybe those elements are rip-offs of Harvest Moon; I haven't played Harvest Moon to know.) Still, it's enjoyable. The design elements and places where it does diverge from Stardew Valley make it more enjoyable, in my opinion.
As the saying goes, "good artists borrow, great artists steal."
Harvest Moon defines the "Turning round a dilapidated farm in a small village where you give everyone gifts all the time" genre. It all comes from there.
EDIT: Stardew Valley has so many QoL improvements over Harvest Moon, though. The early HM games are punishing.
> I feel like the indie-games are almost as clustered in small areas of potential "game design space" as AAA-games are, but just clustered in different areas, in particular around "games inspired by a handful of SNES games and early Playstation JRPGs
Huh? That is also an artifact of what kind of games you follow. Just off the top of my head:
- colony sims
- strategy games (tactical/operational/grand, with real-time, real-time-with-pause, and turn-based options for each)
- racing games
- 4x games
- flight sims
- spaceflight sims
- rpgs
- survival games
- shmups/ bullet hell
- roguelike/roguelite
- exploration
- rhythm games
- horror
- factory builder / management sim
Note that it is in no way new that humans were using fire back then. The oldest hints of deliberate control of fire are from ~2 Myr ago; the oldest evidence that is generally considered conclusive is from ~1 Myr ago.
But that's not evidence for fires started by humans, only that people were managing fire. Prior to this find, the oldest evidence for a method of starting fire was from 50kyr ago. Archeologists generally consider this to be an artifact of what gets preserved, not proof of when people started creating fires. There is a lot of natural fire on the landscape over thousands of years, and teasing apart what is evidence for humans starting fire from what is found naturally is really hard.
So this is an amazing find, but it is not paradigm-shifting in any way.
> Fire allowed early populations to survive colder environments, deter predators and cook food. Cooking breaks down toxins in roots and tubers and kills pathogens in meat, improving digestion and releasing more energy to support larger brains.
> Fire also enabled new forms of social life. Evening gatherings around a hearth would have provided time for planning, storytelling and strengthening group relationships, which are behaviors often associated with the development of language and more organized societies.
seems very important to be able to start fire for this stuff. And if we were off by like an order of magnitude it's pretty paradigm shifting.
We are very certain that people had fire. We just have no direct evidence for it, only indirect.
Also, the ability to start fires is not necessary for having fires, there is some evidence of a practice of maintaining slow-burning fires for essentially forever, letting you catch some from a wildfire and then just maintain it.
They did not lie, you are reading things into the text that are not there.
Firstly "Consensus that humans started making fire 50k years ago" is very different from "Until now, the oldest confirmed evidence ... about 50,000 years ago.". All archaeology has to contend with the reality that we have evidence of a very small fraction of the things that happened in the past, and archaeology focusing on the paleolithic doubly so. Absence of evidence is not evidence of absence.
Secondly, use of fire (which is what is needed for the scenes you quoted) and the ability to start a fire are two entirely different technologies that might be separated by a vast gulf of time. We have clear and accepted evidence of the use of fire from ~1 million years ago, and fragmentary, contested evidence from a million years before that. Prior to this find, we didn't have any evidence of people deliberately starting fires more than 50k years ago. These claims are not in conflict! It is entirely possible that people were maintaining fires they caught from wildfires for a substantially longer period of time than has elapsed since we learned to start fires. Or alternatively, maybe there was a method of starting fires that left no durable evidence.
This is still an amazing find! But it changes a lot less about what we know about our past than a careless read of it might suggest. It certainly does not hint that human evolution was slower than we thought.
The original novel is set in 1625-1628. At that point, firearms are well and truly established, having proven themselves to be the war-winning weapon in the Italian Wars more than a century earlier. They are not new and unproven technology; they are the weapon that the great-grandparents of the main characters fought and won with.
But they are a symbol of the wrong social class. A musket is something that a peasant or a burgher can use to kill a noble. All the main characters in the three musketeers are nobility, and their social class has suffered greatly from the "democratization" of war. They, like almost everyone like them historically, much prefer the old ways from when they were more pre-eminent, and look down their noses at firearms. They spend very little time at war, and a lot more time duelling and participating in schemes.
The high-tech of the early 17th century wasn't even matchlocks anymore, it was flintlocks. Those took another ~50 years to become general issue, but at the time of the novels, upper-class people who could afford modern weapons wouldn't have been fumbling with matches anymore.
Even though firearms were well and truly established by the 17th Century, blade weapons remained important right on through to the mid-1800s.
Bayonet charges were a major aspect of Napoleonic warfare, and only really went away with the development of firearms that had higher rates of fire and were accurate out to larger ranges. In the Napoleonic era, soldiers would close to within 50-100 meters, fire off a few volleys, and then charge in with the bayonet.
By the time armies were equipped with breech-loading rifles that could fire half a dozen accurate shots a minute at a distance of a few hundred meters, the volume and accuracy of fire made the bayonet charge obsolete. But that was rather late (the 1860s or so).
By the Napoleonic Wars, something below 10% of casualties were caused by melee weapons. And even that was mostly cavalry, bayonets account for ~2%. The purpose of the bayonet charge was not to kill your enemy, it was to convince your weakened enemy to cede his position after you had already done the killing. The forces rarely fought hand-to-hand and when they did it was notable, usually one side was so weakened and shocked that they fled or refused to charge.
Even though most of the casualties were caused by musket and artillery fire, the bayonet was tactically very important in Napoleonic warfare. A bayonet charge is absolutely terrifying, and the reason why there were relatively few casualties from them is likely because soldiers would break rank and flee in the face of one. If soldiers had stood their ground and fought, casualties would have been much higher, and with their low rate of fire, muskets would have been of little use in hand-to-hand combat.
The key change that happened in the mid-1800s is that firearms finally achieved ranges and rates of fire that made closing with a massed enemy nearly impossible.
Also, it would be pretty hard for officers to make soldiers carry out a bayonet attack if it weren't known that they'd probably face little or no resistance. People tend to value their lives.
I suppose you are right about the history of firearms. However, the novel was written in 1844, more than 200 years after the time in which it is set.
Which makes me wonder if the author (Alexandre Dumas) knew and cared about the historic facts.
Dumas was meticulously accurate, not to the world as it historically existed, but to how the French upper classes felt about it and wrote about it. He was extremely well read in people's memoirs and diaries, and wrote his stories set in the world as the French aristocracy imagined it to be.
I believe he got this detail right in both ways; in that firearms were the most important weapons, and also the main characters would have done their very best to ignore that fact.
One thing that always bothered me was his use of currency. In the French original he mentions at least 5-6 types of currency and it seems they all have common sub-divisions, despite some of them being Spanish or even Italian.
Was France using other people's currency back then?
The nation of France as we know it did not exist at that time and there was no standardized currency among the kingdoms that made up the crown. Livres, sous, and deniers were the standard unit of accounting but each major polity produced their own coinage. Kings also sometimes devalued their currencies to help pay for wars so traders preferred to use more stable currencies like Spanish and Dutch coins (Louis XIII did a major devaluation about a decade after the time period of the book, which colored perceptions of the time).
It was very common before nationalism and the standardization of currencies. I read primary sources about conquistadors and the contracts financing and supplying the expedition might involve a dozen currencies because each trader supplying the wood, food, animals, etc would work in their own preferred/local currency.
One niggle: France was mostly made up of duchies, not kingdoms. The King of France had allegiance from some of the duchies making up modern France, but notably Burgundy was the one who captured Jeanne d'Arc (Joan of Arc) and turned her over to the English - so clearly not all.
Not sure about the Occitan; IIRC Eleanor was considered a queen in her own right as ruler of Aquitaine, not a duchess.
I'm no historian, but back then, coins were literally worth their weight in gold (or silver, copper, bronze, whatever), so it was probably easier to pay with foreign currency than we might assume...
It’s more that there was a standard unit of accounting (livres, sous, and deniers) and everyone could convert from one currency to that standard and back to another currency. It moved a lot slower than modern foreign exchange so except for local fluctuations, it was rather predictable.
> were literally worth their weight in gold (or silver, copper, bronze, whatever), so it was probably easier to pay with foreign currency than we might assume
Are you sure you know what the coin you were paid is made of? A merchant of the time wasn't. Those who cared about not being scammed never found it simple.
Experienced traders can make a quick estimate of the purity by rubbing it against a touchstone, which has been used since ancient times. And by treating the rubbings with mineral acids you can make even more accurate determinations, although I'm not sure if this was done in the 1620s.
You are discussing absolute certainty, but in practice a box full of Spanish dubloons was very likely to be a treasure trove, and people generally trusted coinage, even if they had doubts. A filed silver penny still often bought a penny's worth of goods.
Everyone used whatever currency was locally available, with every merchant in border regions being very aware of conversion rates. Throughout history there was also a chronic shortage of smaller-denomination cash, stuff for normal people to buy normal things with. Today, we see "clipped" coins as evidence of forgery, when in fact much of that was likely related to a lack of loose change. Nobody in town able to break a gold crown? Well, maybe you buy a horse with a slice of gold from that crown.
Clipping and dissecting a coin into smaller pieces for down-conversion are very different things. The piece of eight wasn't haphazardly cut, but instead pre-indented for breaking cleanly.
If you want to buy something worth 6% of a gold coin, whacking off an edge of one is a weird way to do it. You'd need a scale handy.
Above all, I believe he cared about getting the next draft ready each week, before the deadline, and then about keeping the cliffhanger suspense high, to keep his fish on their hooks.
Alternatively, latecomers will make use of newly cheaply available compute (from the firesales of failing companies) to produce models that match their quality, while having to invest only a fraction of what the first wave had to, allowing them to push the price below the cost floor of the first wave and making them go under.
Apple really doesn't like to own manufacturing. They want to be in a position of a favored key customer, but still have the ability to switch vendors at a moment's notice if they can get a better deal.