• 0 Posts
  • 20 Comments
Joined 1 year ago
Cake day: June 9th, 2023





  • until 0.19.4 is released, clients are supposed to suppress comment contents when a comment is either marked as removed (by a moderator) or deleted (by its creator).

    they might decide to show the contents to site admins or community moderators anyway, but some clients did not implement this properly and show the original content to all users (rough sketch of the intended check below).

    this is of course not something that should have been available to everyone in the first place, which is why this is being fixed in 0.19.4.

    depending on the client, you should still see some kind of indicator above the comment text showing that it was removed or deleted, in this case removed.
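
    purely as an illustration, a minimal sketch of that kind of client-side check, assuming the removed/deleted flags as they exist on comments in the lemmy API; the viewer flag and the placeholder strings are made up for the example:

    ```typescript
    // sketch of the content suppression a client is expected to do before 0.19.4.
    // `removed` and `deleted` mirror the flags on a comment as returned by the API;
    // `viewerIsAdminOrMod` is a hypothetical flag the client would derive itself.
    interface CommentLike {
      content: string;
      removed: boolean; // removed by a moderator
      deleted: boolean; // deleted by the creator
    }

    function displayedContent(comment: CommentLike, viewerIsAdminOrMod: boolean): string {
      if (comment.removed && !viewerIsAdminOrMod) {
        return "*removed by moderator*";
      }
      if (comment.deleted && !viewerIsAdminOrMod) {
        return "*deleted by creator*";
      }
      // admins/mods (or nobody, depending on the client's policy) still see the original text
      return comment.content;
    }
    ```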



  • I have a large library of games I’ve never played on steam. a couple months back I wanted to play a game I had installed a while ago and guess what, forced always online. not from steam, but from the shitty team behind doom (don’t remember which version it was), and this happened to be right when I had a multi-hour internet outage.
    afterwards I figured out I had to explicitly block some network traffic to stop it from trying to force me to sign up for an account with the developer.

    while steam certainly has DRM options, they are configurable by developers, and afaik an always-online requirement can’t be enforced with just steam, only through custom logic in the game or third-party DRM. developers are also free to not use steam DRM at all.

    DRM, as usual, harms the legitimate buyers.

    that being said, steam still does bring a lot of value, such as their hardware developments, their work on better Linux gaming support, update distribution through a trusted source, and various other things.


  • you’re not getting banned from steam, you’re generally getting banned from participating in anti-cheat-secured lobbies of a single game or a group of games.

    the single-player experience is generally not affected.

    having a 3-strike system before getting banned from multiplayer just means a cheater gets three bans out of each copy instead of one, so buying their way back in is effectively about 66% cheaper.

    this is also not new; it has been the case with the current family sharing system.



  • The 90-day disclosure window you’re referencing, which I believe was primarily popularized by Google’s Project Zero, is the time from when someone discovers and reports a vulnerability to when the reporter will publish it if the vendor has not disclosed it by then.

    The disclosure by the vendor to their users (people running Lemmy instances in this case) is a completely separate topic, and, depending on the context, tends to happen quite differently from vendor to vendor.

    As an example, GitLab publishes security advisories the day the fixed version is released, e.g. https://about.gitlab.com/releases/2024/01/11/critical-security-release-gitlab-16-7-2-released/.
    Some vendors will choose to release a new version, wait a few weeks or so, then publish a security advisory about issues addressed in the previous release. One company I’ve frequently seen this with is Atlassian. This is also what happened with Lemmy in this case.

    As Lemmy is an open source project, anyone could go and review all commits for potential security impact to determine whether something may be exploitable. This would similarly apply to any other open source project, regardless of whether the commit is pushed some time between releases or just before a release. If someone is determined enough and spends time on this, they’ll be able to find vulnerabilities in various projects before an advisory is published.

    The “responsible” alternative would have been to publish an advisory at the time the issue was privately disclosed to admins of larger instances, which was right around the Christmas holidays, when many people were already preoccupied with other things in their lives.







  • for our admin team, we’re using a bot to message a matrix room when content is reported and reacting to the message when it’s been handled (rough sketch of such a bot at the end of this comment).

    this could be done pretty much the same way at the mod level, though it’s certainly not easily accessible to everyone due to the hosting involved.

    and all of this is only relevant if you even receive reports about content in the first place. if you moderate a community on another instance, tough luck unfortunately, as reports currently do not federate.

    edit: typos
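
    for anyone curious, a very rough sketch of what such a bot could look like. the lemmy report endpoint and response shape are assumptions from memory, the instance URLs, room id and tokens are placeholders, and the matrix call is the plain client-server send-message endpoint; treat it as a starting point, not a drop-in implementation:

    ```typescript
    // rough sketch: poll lemmy for unresolved comment reports and forward them to a matrix room.
    // endpoints, params and response fields are assumptions; check the API docs of your
    // instance's lemmy version before relying on any of this.
    const LEMMY = "https://lemmy.example.org";      // placeholder instance
    const MATRIX = "https://matrix.example.org";    // placeholder homeserver
    const ROOM_ID = "!reports:matrix.example.org";  // placeholder room id
    const LEMMY_JWT = process.env.LEMMY_JWT!;       // token of a mod/admin account
    const MATRIX_TOKEN = process.env.MATRIX_TOKEN!; // access token of the bot user

    async function fetchCommentReports(): Promise<any[]> {
      // assumed endpoint; there is a matching one for post reports
      const res = await fetch(
        `${LEMMY}/api/v3/comment/report/list?unresolved_only=true`,
        { headers: { Authorization: `Bearer ${LEMMY_JWT}` } },
      );
      const data = await res.json();
      return data.comment_reports ?? [];
    }

    async function postToMatrix(text: string): Promise<void> {
      // matrix client-server API: send an m.room.message event with a unique txn id
      const txnId = Date.now().toString();
      await fetch(
        `${MATRIX}/_matrix/client/v3/rooms/${encodeURIComponent(ROOM_ID)}/send/m.room.message/${txnId}`,
        {
          method: "PUT",
          headers: {
            Authorization: `Bearer ${MATRIX_TOKEN}`,
            "Content-Type": "application/json",
          },
          body: JSON.stringify({ msgtype: "m.text", body: text }),
        },
      );
    }

    async function main() {
      const seen = new Set<number>();
      // naive polling loop; the "react when handled" part would need a matrix sync loop on top
      while (true) {
        for (const report of await fetchCommentReports()) {
          const id = report.comment_report?.id;
          if (id == null || seen.has(id)) continue;
          seen.add(id);
          await postToMatrix(
            `new report: ${report.comment_report.reason} on ${report.comment?.ap_id}`,
          );
        }
        await new Promise((resolve) => setTimeout(resolve, 60_000));
      }
    }

    main();
    ```

    in practice you’d probably reach for lemmy-js-client and a matrix bot SDK instead of raw fetch calls, but the overall flow stays the same.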