Probably a nuanced point, but what's the purpose of espousing the virtues of performance if you don't have the output to show it's worth it?
If you want advice about making games, would you rather learn from the person who routinely ships games or the person who shipped a game once, 10 years ago?
Is that a trade off worth chasing? "Potential perfection" with nothing to show for it?
More like, shipped 2 hit games, which were both technological and artistic feats for their time. And developed a blazingly fast compiler. Casey was also a developer at RAD Game Tools, building animation tools. Their output is probably better than most industry developers'. I understand if you don't like their attitudes and the way they attempt to teach/preach to other engineers, but IMO their work speaks for itself. I take their advice and try to apply it to my own work, because it seems to have worked for them.
I'm not saying I don't like their attitudes but it's a viewpoint I am struggling with myself.
I'm starting to realize that caring about all these minutiae doesn't really matter for my professional goals. I know my software isn't special; caring about pumping out as much performance as possible when I just sling JS professionally feels a tad myopic?
What is the point if it just continues the pattern of procrastination away from the actual goals I want to achieve? Does this also apply to them?
What is the point of espousing all these supposed virtues when the output isn't that special? I mean Braid is still good, but let's not act like greener devs haven't put out good games too without all the jackassery baggage.
Yea I largely agree with you on that point. I think when discussing Jon, Casey (and to add another, Mike Acton), there's actually a set of distinct pieces of advice they give that gets lumped into a whole, and people don't really see the parts of what they're saying and instead focus on the part that sounds most critical of their work.
I do agree that if you take from their "teachings" that every dev needs to optimize everything and never use anything but systems languages, that advice is myopic for most devs. However, I don't really see them arguing for that, at least not entirely.
From following their teachings for a while, they mostly preach the following things, which I agree with even when talking about higher-level languages and technologies.
- Get the clowns out of the car: Don't make things needlessly expensive. Write simple procedural code that maps cleanly to what the hardware is doing. This is essentially saying that OOP, heavy message passing, and other paradigms that abstract the problem away from the simple computations happening on your computer add complexity that isn't needed. This isn't about tuning your program for the highest possible performance, but rather writing basic code, easy to follow and debug, that acts as a data pipeline as much as possible. Use simple constructs to do the things you want, e.g. an if-statement versus inheritance for dynamic dispatch.
- Understand your problem domain, including the hardware, so you can reason about it. Don't abstract away the hardware your code actually runs on so much that you lose vital information about how to make it work well. I've seen this many times in my professional career: devs don't know what hardware the code will be running on, and this inevitably makes their code slower, less responsive to the user, and often drives up cost. Many times in my early career (backend engineering), just simplifying the code and designing it to work well on the hardware we expected greatly lowered cost. The hardware is the platform and it shouldn't be ignored. Similarly, limitations imposed by your solution should be documented and understood. If you don't expect a TPS greater than some value, write that down, check for it, profile, and make sure you know what your spectrum of hardware can handle and how much utilization of that hardware your software is getting.
- Focus on writing code, and don't get bogged down by fad methodologies (TDD, OOP, etc). Write simple code, understand the problem more deeply as you write, and don't place artificial constraints on yourself.
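To make that first point concrete, here's a minimal sketch (all names are hypothetical, not taken from any of their codebases) of dispatching on an enum with a plain branch instead of an inheritance hierarchy with virtual methods:

```typescript
// Hypothetical example: per-kind behavior without an Entity/Player/Enemy
// class hierarchy. A plain tagged struct plus a branch keeps all the
// update logic in one visible function that's easy to read and step through.

enum EntityKind { Player, Enemy }

interface Entity {
  kind: EntityKind;
  x: number;
  speed: number;
}

// A simple if-statement replaces virtual dispatch.
function update(e: Entity, dt: number): void {
  if (e.kind === EntityKind.Player) {
    e.x += e.speed * dt; // player moves right
  } else {
    e.x -= e.speed * dt; // enemy moves left
  }
}

const player: Entity = { kind: EntityKind.Player, x: 0, speed: 10 };
const enemy: Entity = { kind: EntityKind.Enemy, x: 100, speed: 5 };
update(player, 1); // player.x becomes 10
update(enemy, 1);  // enemy.x becomes 95
```

The data is also just a flat array-friendly struct, which is the "data pipeline" shape they advocate; whether the branch beats a vtable in practice is situational, but the readability/debuggability argument is the point here.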
Now each of these points can be debated, but they're harder to argue against IMO than the strawman idea that they demand you optimize as much as possible. And they argue that you will actually be more productive this way, and produce better software.
FWIW, you may have some datapoints showing that they do propose what I called the strawman version of their ideas, but I have seen them advocating for the above points more than anything else.
---
I do want to add, for Jon Blow, I don't think he has a problem with people using engines. From what I've seen, he's played and loved games that used engines in the past, and had no problem with their output in terms of gameplay or performance. Judging from his talk about civilizational collapse as it relates to game dev, he's more concerned that if no one tries to develop without an engine, we as a civilization will lose that ability.
Yes, this is well put. I was heavily influenced by Casey back in 2014 and the advice I give juniors now is always that first point about "getting the clowns out of the car". There's a lot of crossover with the "grug brained developer" here too, which is much more aligned with the web/enterprise world.
I find it very hard to convince people though. It runs counter to a lot of other practices in the industry, and the resulting code seems less sophisticated than an abstraction pile.
Aha! I think I know my contention with this advice now. Who can actually disagree with this? Like I'm saying yes to everything, no one I know would say no to this. Never had a coworker say aloud: "I want to write code to make things worse."
These are the platitudes of our industry that no one disagrees with. Like you said, this is what the Blow + Muratori teachings can be distilled into. But there is something worse this advice also assumes, coming from such people.
Both Blow + Muratori have extremely privileged dev careers that a good ~80% of us will never achieve: they have autonomy. The rest of us are merely serfs in someone's fiefdom. Blow has his fief, Muratori his. They can control their fiefs, but the majority of us devs cannot. We don't have autonomy in the direction of the company, we don't control the budgets, we don't even control who we interview or hire.
Placing this onus of organizational standards on someone with no authority to dictate them isn't just. Also, as someone who has consumed content from these two for a good 8-ish years as their videos pop into my feed: I've never seen them advocate for workers to be empowered to make their environments better. They mostly just trash on devs who have no authority.
With that mindset I need to seriously decouple myself from these people. Plenty of other devs advocate for the same things in our craft while also advocating for better rights as workers.
> I don't think he has a problem with people using engines. From what I've seen he's played, and loved games that used engines in the past
He's also said quite openly that if you're only starting, it's fine if you reach for a ready-made engine. It's that you should try and understand how things and systems work as you progress.
No, because programmers aren't the ones pushing the wares; it's business magnates and salespeople. The two core groups software developers should never trust.
Maybe if this LLM craze were being pushed by democratic groups where citizens are allowed to state their objections to such systems, and where those objections are taken seriously; but what we currently have is business magnates who just want to get richer, with no democratic controls.
This seems like an overly reductive worldview. Do you really think there isn't genuine interest in LLM tools among developers? I absolutely agree there are people pushing AI in places where it is unneeded, but I have not found software development to be one of those areas. There are lots of people experimenting and hacking with LLMs because of genuine interest and perceived value.
At my company, there is absolutely no mandate to use AI tooling, but we have a very large number of engineers who use AI tools enthusiastically simply because they want to. In my anecdotal experience, those who do tend to be much better engineers than the ones who are most skeptical or anti-AI (though it's very hard to separate how much of this is the AI tooling, and how much is that naturally curious engineers looking for new ways to improve inevitably become better engineers than those who don't).
The broader point is, I think you are limiting yourself when you immediately reduce AI to snake oil being sold by "business magnates". There is surely a lot of hype that will die out eventually, but there is also a lot of potential there that you guarantee you will miss out on when you dismiss it out of hand.
I use AI every day and run my own local models, that has nothing to do with seeing sales people acting like sales people or conmen being con artists.
Also add in the fact that big tech has been extremely damaging to western society for the last 20 years, there's really little reason to trust them. Especially since we see how they treat those with different opinions than them (trying to force them out of power, ostracize them publicly, or in some cases straight up poisoning people + giving them cancer).
Not really hard to see how people can be against such actions? Well buckle up bro, come post-2028 expect a massive crackdown and regulations against big tech. It's been boiling for quite a while and there are trillions of dollars to plunder for the public's benefit.
> No because programmers aren't the ones pushing the wares, it's business magnates and sales people.
This is not correct, plenty of programmers are seeing value in these systems and use them regularly. I'm not really sure what's undemocratic about what's going on, but that seems beside the point, we're presumably mostly programmers here talking about the technical merits and downsides of an emerging tech.
These people are working on destroying the planet to make more money, they absolutely do not care. Our society isn't set up to punish them, but encourage such behavior to even more extremes (see datacenter build outs causing water shortages, electricity hikes, and cancer in poor communities; nearly every politician capitulating on such actions because they don't know better).
I wish people would get off the "AI is the worst thing for the environment" bandwagon. AI and data centers as a whole aren't even in the top 100 emitters of pollution and never will be.
If you want to complain about tech companies ruining the environment, look towards policies that force people to come into the office. Pointless commutes are far, far worse for the environment than all data centers combined.
Complaining about the environmental impact of AI is like plastic manufacturers putting recycling labels on plastic that is inherently not recyclable, making it seem like plastic pollution is everyday people's fault for not recycling enough.
AI's impact on the environment is so tiny it's comparable to a rounding error when held up against the output of say, global shipping or air travel.
Why don't people get this upset at airport expansions? They're vastly worse.
The answer to that is simple: They hate AI and the environment angle is just an excuse, much like their concern over AI art. Human psychology is such that many of these people actually believe the excuse too.
It helps when you put yourself in the shoes of people like that and ask yourself, if I find out tomorrow that the evidence that AI is actually good for the environment is stronger, will I believe it? Will it even matter for my opposition to AI? The answer is no.
You don't know that. I don't know about you (and whatever you wrote possibly tells more about yourself than anyone else), but I prefer my positions strong and based on reality, not based on lies (to myself included).
And the environment is far from being the only concern.
You are attacking a straw man. For you, being against GenAI is necessarily irrational, simply because it happens to go against your beliefs. Please don't do this.
> I prefer my positions strong and based on reality, not based on lies (to myself included).
Then you would be the exception, not the rule.
And if you find yourself attached to any ideology, then you are also wrong about yourself. Subscribing to any ideology is by definition lying to yourself.
Being able to place yourself into the shoes of others is something evolution spent 1000s of generations hardwiring into us, I'm very confident in my reading of the situation.
> Having beliefs, principles or values is not lying to oneself.
The lie is that you adopted "beliefs, principles or values" which cannot ever serve your interests; you have subsumed yourself into something that cannot ever reciprocate. Ideology by definition even alters your perceived interests, and a more potent subversion cannot be had (up to now, with potential involuntary neural interfaces on the horizon).
> Citation needed
I will not be providing one, but that you believe one is required is telling. There is no further point to this discussion.
People are allowed to reject whatever they want. I'm sorry that democracy is failing to let you make slightly more money while the rest of society suffers.
I'm glad people are grabbing the reins of power back from some of the most evil people on the planet.
Of course they aren't polluters in the sense of generating some kind of smoke themselves. But they do consume megawatts upon megawatts of power that has to be generated somewhere. You don't often have the luxury of building near a nuclear power plant. And in the end you're still releasing those megawatts as heat into the atmosphere.
Egghead.io is not worth it; the courses there have a shelf life even shorter than Frontend Masters'. Authors mostly use it to dump their wares and never update the course afterwards, while breaking API changes litter their backlog, making most content worthless unless it was released in the last 3 months.
Absolutely not worth it since the courses are on par with random youtube tutorials IMO.
Also really dislike the pattern of some popular frontend frameworks selling basic documentation in the form of "courses."
Thanks for replying. Agreed on shelf life, but IME at least some of the egghead materials, at the time of publication, have been worthwhile to me. But that experience (5+ years ago) is quite possibly out of date.
I'm sure if Codeberg had equivalent resources they'd be good, hard to fault a nonprofit for not benefiting from a trillion dollar multinational corporation. What was GitHub's excuse for their failures?
> Perfectly good excuse to make society worse for people
What an incredibly silly accusation to make of a company/service that streams movies and television. Like, you understand it is possible to dilute the concept of civic responsibility, right?
Companies don't care about society, unless it affects profit. Companies are not people, they are cold machines that through different means try to reach the same purpose, make more money.
No one should anthropomorphize companies. They might look like they have human qualities, the same way the T-800 in The Terminator looked human.
Well, look at what has always happened in society when young people have no hope for the future: massive societal disruption, mostly in the form of revolution + violence.
Since this isn't the 1800s anymore there won't be any major revolutions, but I expect way more societal violence going forward. If you have no hope for the future it's not hard to go down very dark paths quickly, usually through no fault of your own, sadly.
Now add how easy it is for malicious actors to get an audience and how LLM tech makes this even easier to do. Nice recipe for a powder keg.
It is absolutely the role of government to regulate commerce and establish competitive markets (note the lack of the word "free" here).
I also have zero faith in tech leadership as they have been the major driver of mass misery across humanity. Not only should they be stripped of their positions in their companies, but leadership should be directly given to the workers.
It's the only way to right the wrong. If it's good enough for executives (voting for other executives, pay packages, and company direction), it's good enough for workers.
How did iRobot hurt anyone? It seems like Warren hurt their workers by denying them the opportunity to keep their jobs. Moreover, the whole home-robotics industry now resides in China, where the companies are run in an even more authoritarian fashion.
This looks like a way to let a few key players gobble up all the federal dollars by forcing many executive-controlled agencies to be force-fed these LLM solutions, because these same key players cannot sell their wares to the public, so they need to steal public money, once again.
The difficulty for any party to want to govern after this is... there is no government. It is all oligarch captured, the candidates are oligarch sponsored, and don't count on the media to sound the siren because, well, you know why.
This is a plane that is never gonna fly again. The only way is to build a new plane, as impossible as that might sound.
https://nvd.nist.gov/vuln/detail/CVE-2025-29927
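For context, CVE-2025-29927 was a Next.js authorization bypass: middleware (often the auth check) could be skipped by sending a crafted `x-middleware-subrequest` request header, which Next.js used internally to mark its own subrequests. Besides upgrading, the commonly advised stop-gap was to strip that header from external traffic at the proxy/edge. A minimal sketch of such a filter (`stripInternalHeaders` is a hypothetical helper, not part of Next.js):

```typescript
// Hypothetical edge-side header filter: drop the internal
// x-middleware-subrequest header from externally supplied requests
// so a client can't spoof it and skip middleware.
function stripInternalHeaders(
  headers: Record<string, string>
): Record<string, string> {
  const blocked = new Set(["x-middleware-subrequest"]);
  const clean: Record<string, string> = {};
  for (const [name, value] of Object.entries(headers)) {
    // Header names are case-insensitive, so normalize before checking.
    if (!blocked.has(name.toLowerCase())) clean[name] = value;
  }
  return clean;
}

const incoming = {
  "Host": "example.com",
  "X-Middleware-Subrequest": "middleware:middleware:middleware",
};
const filtered = stripInternalHeaders(incoming);
// "Host" survives; the spoofed internal header is dropped.
```

This is only the mitigation shape, under the assumption that you control a proxy in front of the app; the real fix is the patched Next.js release.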
That, plus the most recent React one, and you have a culture that does not care for its customers but rather chases fads to further greedy careers.