

That’s quite a strong table, holding 11 people
Love Platinum, they’re the only fountain pens I can even consider using, as they’re the only ones I’ve found that can handle my low writing volume without drying out between uses.
Eh, I have audio interfaces and MIDI controllers on 10ft cables cause shorter ones just don’t reach my PC, and they work perfectly fine. Longer than that is a gamble, but as far as I know 10ft (~3m) is the practical limit for USB 3.0 signaling, so you should be totally fine unless you have especially shitty cables.
58% goes to fundraising, administrative, and technological costs. The rest goes towards, but is not limited to, other programs.
Only thing I can find in their financials that would maybe qualify as “random outreach” would be “awards and grants”, at 26mil last year out of 185mil revenue, or 14%.
https://meta.m.wikimedia.org/wiki/Grants:Programs/Wikimedia_Community_Fund
As far as I can tell, it’s not particularly random.
Maybe I’m missing something?
Eh, I’m about the same age as OP; I don’t have to get to 50 to know that I’d take my parents’ economic context over the two crashes. The rest… For many reasons, if medicine makes some miraculous leap forward by then, maybe I’ll still wish I had a lot more left to go.
Really big updates obviously require a major version bump, to signal to users that potential stability or breakage issues are to be expected.
If your software is following semver, not necessarily. It only requires a major version bump if a change breaks backwards compatibility. You can have very big minor releases and tiny major releases.
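A minimal sketch of that rule, assuming a toy changelog format (the `Change` shape and the examples are made up for illustration):

```typescript
// Minimal sketch of the semver bump rule: only breaking changes force a major
// bump, no matter how many features a release ships.
type Change = { description: string; breaking: boolean; feature: boolean };

function nextVersion(current: string, changes: Change[]): string {
  const [major, minor, patch] = current.split(".").map(Number);
  if (changes.some(c => c.breaking)) return `${major + 1}.0.0`;       // breaking -> major
  if (changes.some(c => c.feature)) return `${major}.${minor + 1}.0`; // features -> minor
  return `${major}.${minor}.${patch + 1}`;                            // fixes only -> patch
}

// A huge release with 50 new backwards-compatible features is still just a minor bump:
const big = Array.from({ length: 50 }, (_, i) =>
  ({ description: `feature ${i}`, breaking: false, feature: true }));
console.log(nextVersion("2.3.1", big)); // "2.4.0"

// A tiny release with a single breaking change is a major bump:
console.log(nextVersion("2.3.1",
  [{ description: "renamed an API", breaking: true, feature: false }])); // "3.0.0"
```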
there is more time for people to run pre-release versions if they are adventurous, and thus there is better testing
Again, from experience, this assumes a lot.
From experience shipping releases, “bigger updates” and “more tested” are more or less antithetical. The testing surface tends to grow exponentially with the number of features you ship in a given release, to the point that I tend to see small, regular releases as a better sign of stability.
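A quick back-of-the-envelope on that growth, assuming every feature is an on/off toggle and any pair of features can interact (numbers purely illustrative):

```typescript
// Back-of-the-envelope for how the testing surface blows up with release size:
// n features give n*(n-1)/2 possible pairwise interactions and 2^n on/off states.
function testingSurface(featureCount: number) {
  const pairwiseInteractions = (featureCount * (featureCount - 1)) / 2;
  const stateCombinations = 2 ** featureCount;
  return { featureCount, pairwiseInteractions, stateCombinations };
}

for (const n of [2, 5, 10, 20]) console.log(testingSurface(n));
// n=2  ->   1 pair,       4 states
// n=5  ->  10 pairs,     32 states
// n=10 ->  45 pairs,   1024 states
// n=20 -> 190 pairs, 1048576 states: why small, regular releases are easier to trust
```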
I’d love to share your optimism, especially regarding that last sentence. As long as Google controls the most popular web browser out there, I don’t see the arms race ever stopping; they’ll just come up with something else. It wouldn’t be the first time they’ve pushed for something nobody asked for that only benefits themselves.
That’s not “self hosting” related tho lol
It desperately needs interface types if we ever hope to make it a serious contender for general-purpose web development. The I/O overhead of having to go through JS to use any web API adds up fast, and it limits a lot of use cases.
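For context, here’s roughly what the JS glue for handing a single string to a WASM module looks like today; the `alloc` and `take_string` exports are hypothetical stand-ins for whatever a given toolchain actually emits, but the encode-and-copy dance is the real cost:

```typescript
// Sketch of today's JS <-> WASM string-passing dance. `alloc` and `take_string`
// are hypothetical exports standing in for whatever your toolchain emits.
async function callWasmWithString(wasmBytes: BufferSource, text: string) {
  const { instance } = await WebAssembly.instantiate(wasmBytes);
  const exports = instance.exports as unknown as {
    memory: WebAssembly.Memory;
    alloc: (len: number) => number;               // returns a pointer into linear memory
    take_string: (ptr: number, len: number) => void;
  };

  const bytes = new TextEncoder().encode(text);   // copy #1: JS string -> UTF-8 bytes
  const ptr = exports.alloc(bytes.length);        // ask the module for a buffer
  new Uint8Array(exports.memory.buffer, ptr, bytes.length).set(bytes); // copy #2: into linear memory
  exports.take_string(ptr, bytes.length);         // the actual call, at last
}
```

Interface types (since subsumed by the component model proposal) would let the host and the module agree on higher-level types, so glue like this wouldn’t have to be hand-rolled and paid for on every call.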
Considering the community we are on, I assumed the criticism was more about the privacy problems surrounding the engine and browser security model than the quality of the language itself. If that was the intent, I mean… Yeah, its weak typing is a fucking mess.
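A few of the classics, for anyone who hasn’t been bitten yet (written as TypeScript with `any` so the compiler lets the plain-JS coercions through):

```typescript
// Classic weak-typing footguns; `any` is used so TS allows plain-JS coercion.
const arr: any = [];
const obj: any = {};
console.log(arr + obj);         // "[object Object]" ([] coerces to "", {} to "[object Object]")
console.log("1" + 1);           // "11" (+ prefers string concatenation)
console.log(("1" as any) - 1);  // 0 (- coerces both sides to numbers)
console.log(arr == false);      // true ([] -> "" -> 0, false -> 0)
console.log(null == undefined); // true... but null == 0 is false
```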
The stuff it eventually replaced, like Flash, Java applets, and Silverlight, was arguably even worse. There’s a legitimate need to run client-side code at times; IMHO the mistake was making it so permissive by default. Blaming the language for the bad browser security model is kind of throwing the baby out with the bathwater.
Is it solid wood or engineered? Some very soft variety of wood? 17 years is extremely short…
As of 2021, the US spent 16.6% of its GDP ($23.59 trillion) on healthcare expenditures. The next highest was Germany, at 12.7% of its $4.28 trillion GDP. The US spends more per capita than any other OECD country on healthcare; it’s just not made visible by the number on your tax report. You’re still collectively paying for it one way or another.
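Back-of-the-envelope per-capita figures from those numbers (the rough 2021 populations, ~332M for the US and ~83M for Germany, are my own added assumption):

```latex
\[
\text{US:}\quad 0.166 \times \$23.59\,\mathrm{T} \approx \$3.92\,\mathrm{T}
\;\Rightarrow\; \frac{\$3.92\,\mathrm{T}}{332\,\mathrm{M}} \approx \$11{,}800\ \text{per person per year}
\]
\[
\text{Germany:}\quad 0.127 \times \$4.28\,\mathrm{T} \approx \$0.54\,\mathrm{T}
\;\Rightarrow\; \frac{\$0.54\,\mathrm{T}}{83\,\mathrm{M}} \approx \$6{,}500\ \text{per person per year}
\]
```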
But hey, yay, low taxes. Good for you, I guess?
Considering how little we actually know about said nature, how much we are still figuring out today, and how wrong we once were (and most definitely still are) on many things, the naturalistic argument is IMHO rather weak. The argument silently assumes too many things, at least given our current knowledge - that human beings actually have an inherent nature, that said nature is uniform enough across the whole species to support that generalization, that said nature is inevitable and can’t be evolved past or rationalized against, that it always was and always will be the case, etc.
Ah, well that’s what almost always ends up happening, doesn’t it… The only thing that legitimately trickles down in this fucking system is costs to consumers lol
I’m not saying the middle ground doesn’t exist, just that said middle ground visibly doesn’t cause enough damage to businesses’ bottom lines, so companies have zero incentive to “fix” it. It just becomes part of the cost of doing business. I sure as hell won’t blame programmers for business decisions.
I’m not sure if you’re agreeing or trying to disprove my previous comment - IMHO, we are saying the exact same thing. As long as those stranded travelers or data breaches cost less than the missed business from not getting the product out in the first place, from a purely financial point of view, it makes no sense to withhold the product’s release.
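Spelled out, the bean-counting in question is just an expected-cost comparison (the variables are illustrative, not from any real post-mortem):

```latex
\[
\text{ship now if}\quad p_{\text{failure}} \cdot C_{\text{failure}} \;<\; C_{\text{delay}}
\]
```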
Let’s be real here, most developers are not working on airport ticketing systems or handling millions of users’ private data, and the cost of those systems failing isn’t nearly as dramatic. Those rigid procedures civil engineers have to follow come from somewhere, and it’s usually not any individual engineer’s goodwill, but regulations and procedures written in the blood of previous failures. If companies really had to feel the cost of data breaches, I’d be willing to wager we’d suddenly see a lot more traction for good development practices.
Faux wood was indeed everywhere at the time. It was a mix of the materials becoming affordable and a certain commodification of the hippie aesthetic. Electronics were perceived more as appliances back then, and it was a common trend to make appliances less sterile by adding “natural” materials such as wood.