Probably off-by-one errors
Find me on Mastodon, if you want.
Holy shit, Math Blaster 9-12. You just threw me back SO far. I just had a vivid image in my mind’s eye of the home office I played it in.
Thank you for the throwback.
implying that any developer actually reads warnings
Same here, to a certain extent.
I was referring only to Linux’s lack of bullshittery in comparison to Windows, nothing else.
Far easier to do too. I did one of each last month and there’s no question that the Windows setup experience is terrible in comparison.
I want to like Forgejo but the name is really terrible.
Is it “forj-joe”? Nah, that double-J sound is way too awkward.
Do you then merge the J sounds to make “forjo”? If so, why not just call it that?
Is it maybe “for-geh-joe”? That seems the most likely to me, but then that ignores the “build < forge” marketing on their website.
I know it’s pretty inconsequential, but it feels weird using a tool that you don’t even know how to pronounce the name of.
Seems like a “haha JS bad” kind of joke, but OP seems to forget that Python is also in a similar boat.
You at least have to know that it’s a meme format. Otherwise it just looks like someone complaining about async with a bad crop.
Interestingly, this JXL loads in Boost, but the one in the post doesn’t. Perhaps it’s because it’s inside a comment?
I would say finding that the bug is in a library is worse than finding it in your own code.
If it’s your own code, you just fix it.
If it’s in a library you then have to go and search for issues. If there isn’t one, you then go and spend time making one and potentially preparing a minimum reproducible example. Or if you don’t do that (or it’s just unmaintained) then you have to consider downgrading to a version that doesn’t have the bug and potentially losing functionality, or even switching to another library entirely and consequently rewriting all your code that used the old one to work with the new one.
Yeah, I’d take my own bugs over library bugs any day.
I’d happily pay a one-time fee to be able to use my own cloud service like Google Drive, OneDrive, or iCloud.
You can do that without paying. Obsidian vaults are just plaintext files on your disk. Just make a vault in your GDrive/OneDrive/iCloud sync folder and it’ll be synced.
There’s likely an extra hoop or two to jump through if you want mobile access, but it’s not too much extra effort.
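On the desktop side, the setup really is just files in a folder. A minimal sketch, assuming a sync client that mirrors a local directory (the `OneDrive` folder name and `MyVault` path are placeholders, not anything Obsidian requires):

```shell
# Placeholder paths -- substitute your own sync folder and vault name.
SYNC_DIR="${SYNC_DIR:-$HOME/OneDrive}"
VAULT="$SYNC_DIR/MyVault"

# Any folder becomes an Obsidian vault once you open it in the app
# (Obsidian then creates a .obsidian/ config directory inside it).
mkdir -p "$VAULT"

# Notes are plain Markdown files, so the sync client treats them
# like any other documents.
printf '# Welcome\n' > "$VAULT/Welcome.md"
```

From there the sync client uploads the files like anything else in the folder; Obsidian itself never needs to know it lives in a synced directory.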
Ah I was just referring to my laptop there. I do still use Android, but with LineageOS instead of my device’s stock image.
How am I the product when I bought it outright and installed Linux before ever booting it up?
I guess I didn’t buy my phone or my laptop then?
Last I checked, almost none. They provide a JS API for common functions, so as long as you’re keeping things relatively simple you might not have to touch much Rust at all.
that would break iMessage support on older iOS devices that are no longer supported
Yes, that’s what “no longer supported” means.
Valve is currently a private company, which is likely why they’ve been able to avoid enshittification for so long. All we can do is hope that whoever eventually takes over when Gabe steps down also shares his ideals.
Ah I see. I don’t think there’s a way to do that yet.
If you’re so inclined, perhaps you could contribute to the discussion (or development) around tags on Lemmy here, since a feature like that would solve your issue.
I’ve heard Intel chips still run hot, especially the 14th Gen i9. However, I came across this article by Puget Systems (a system integrator that mainly deals in professional workstations rather than gaming rigs), who found that lowering the PL1 (Power Limit 1, the sustained power limit) from 253 W to 125 W was a good enough performance/heat tradeoff that it’s the default configuration they ship to their customers.
On the other hand, they do mention that tasks such as UE light baking, V-Ray, Cinebench, and Blender saw gains of 10–18% at the higher power limit, which sounds much more like OP’s workload. Puget then go on to recommend a higher-core-count CPU like a Threadripper PRO for those kinds of workloads, so perhaps OP really would be better off going AMD for their workstation.