Were you using it on some other dudes junk by any chance or something…? Cuz otherwise that seems like a leap.
Unfortunately for some of them, even if the game itself works, there are often cases where either mods or some overlay/other additional software don't work.
On your answer though, I was under the impression that when you configure the KVM passthrough setup, it makes the video card used for the passthrough inaccessible to the host itself, and that making it accessible again requires undoing some of the config and a restart. Is this incorrect?
An app named “Bring!”. It’s pretty barebones; the only features it has are
It’s pretty much all we need.
Pepperoni, red onion, sliced pickles, mix of mozzarella and cheddar (around 60-40), marinara sauce, medium crust.
In their defense, they also clearly label immich as under active development with frequent changes and bugs.
Edit: nvm I saw it was already discussed in another reply.
Backwards compatibility - yes I agree, it’s quite good at it.
Hardware specific issues for any OS - disagree. For windows, 80-90% of that is handled by the hardware manufacturer’s drivers; whether issues get fixed or not isn’t really down to an effort from Microsoft. For Linux it’s usually an effort of the maintainers, and if anything, Linux is famous for supporting old hardware that windows no longer works with.
But the point I was making is not to say Linux or osx is better than windows or vice versa, it’s that windows holds by far the largest market share in desktops and neither of the alternatives are really drop-in replacements. So in the end they have no pressure on them to improve UX since it’s infeasible to change OS for the majority of their users at the moment.
Aside from the effort required others have mentioned, there’s also an effect of capitalism.
For a lot of their tech, they have a near-monopoly or at least a very large market share. Take windows from Microsoft. What motivation would they have to fix bugs which impact even 5-10% of their userbase? Their only competition is linux with its around 4(?)% market share and osx which requires expensive hardware. Not fixing the bug just makes people annoyed, but 90% won’t leave because they can’t. As long as it doesn’t impact enterprise contracts, it’s not worth fixing, because the time spent doing that is a loss for shareholders; meanwhile, new features that can collect sellable data (like copilot, for example) generate money.
I’m sure even the devs in most places want to make better products and fight management for more time so features can ship with better quality - but it’s an exhausting, never-ending uphill battle, and at the end of the day the person who shipped a broken feature with data collector 9000 built in will probably get the promotion, while the person who fixed 800 5+ year old bugs gets a shout-out on a zoom call.
Doesn’t count until it runs doom.
Not a lawyer, but in the scenario where proton closed the source but kept offering the build: even if gpl3 still applies, since they’re the only copyright holder (no contributions), wouldn’t it only give them grounds to sue themselves?
From gnu.org:
The GNU licenses are copyright licenses; free licenses in general are based on copyright. In most countries only the copyright holders are legally empowered to act against violations.
I haven’t used tailscale to know how well it works but as a current zerotier user I’ve been considering moving away from it.
I actually love the idea and it’s super simple to set up, but it has some very annoying pitfalls for me:
Pretty much all of the issues I’ve had were with devices that have to disconnect and re-connect from the network and/or devices that move between different networks (like laptop, phone). On my router, it’s been super stable. Point is, your mileage may vary - it’s worth trying but there are definitely issues.
Would you accept a certificate issued by AWS (Amazon)? Or GCP (Google)? Or azure (Microsoft)? Do you visit websites behind cloudflare with CF issued certs? Because all 4 of those certificates are free. There’s no real identity validation when signing up for any of them beyond having access to some payment method (and I don’t think all of them even require that). And you could argue those 4 companies handle about 80-90% of the traffic on the internet these days.
Paid vs free is not a reliable basis for trust. If anything, a non-automated process where a random engineer just gets the new cert and then hopefully remembers to delete it has a number of risk factors that don’t exist with LE (or other ACME-supporting providers).
I like to think the behind the scenes is just a decades long game of dare in Mozilla’s leadership that slowly got out of control but they’ve all gotten too deep in it now to give up and just call it a tie.
Not OP but some stores have these hyper-sensitive scales you put your bag/scanned items on. They can be super annoying as tiny differences in the weight will lock up the entire thing and you need someone to unlock it again. E.g. if you didn’t start with all your bags already on it and you try to add a new bag. Or the area is full and you want to remove an already full bag. Or you nudged something with your leg while scanning the next item.
The best thing is grocery stores where they have handheld scanners you can take with you in the store, you scan the item as you put it into your bag and on the way out you just scan the code on the self-service checkout and pay. Least effort possible, plus the scanner doesn’t have the “oh a speck of dust landed on the scales… obviously this means he’s trying to steal shit” issue.
It used to be an open source project, then at some point the developers moved it to closed source. In reaction to this, a couple of people forked the last open source version of emby and launched it as an open source project (again) named jellyfin.
It is still open source and under active development, and has a significant userbase. Especially on Lemmy I think it’s much preferred by people to emby (or at least more vocally supported).
I have no experience with this, but I happened to see an interview with Ludwig Minelli, the founder of Dignitas (an organisation for assisted death). The man is 90+ and still fighting for this right. I believe I originally saw it in video format, but I think this was the interview - it’s worth a read.
I’d suggest you look up the contact for the various organisations and reach out with your situation and questions to see what they say. They’re likely to be much better sources of information.
Maybe set up a script that runs locally and pings an external service like 1.1.1.1 or 8.8.8.8 every second, to see if connectivity survives during the windows when your services alert. Perhaps it’s your modem refreshing some config which causes a blip for a few seconds or something similar. If this doesn’t alert, at least you can rule out your internet fully going out.
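Something like this is what I have in mind - a rough, untested Python sketch (the target IP and the Linux-style ping flags are just my assumptions, adjust for your system):

```python
#!/usr/bin/env python3
# Rough sketch: ping an external IP once a second and log every failure
# with a timestamp, so you can line it up with the alert windows.
# Assumes Linux iputils ping ("-c"/"-W" flags); other systems differ.
import subprocess
import time
from datetime import datetime

TARGET = "1.1.1.1"  # or 8.8.8.8

while True:
    failed = subprocess.run(
        ["ping", "-c", "1", "-W", "1", TARGET],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    ).returncode != 0
    if failed:
        print(f"{datetime.now().isoformat()} ping to {TARGET} failed", flush=True)
    time.sleep(1)
```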
The other side of this would also be useful: run a similar check towards different levels of your home network to see how far down it gets (e.g. ping your router, expose some simple TCP echo service on the server running all this and nc it, curl the status page of the reverse proxy (or set up a static page in it), curl the app behind the reverse proxy - just make sure to use firewall rules for this and not just put everything on the internet). Where it fails should hopefully give you some idea to go on.
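As a rough sketch of that idea - every host, port and URL below is a placeholder, swap in your own router IP, echo port and proxy/app addresses:

```python
#!/usr/bin/env python3
# Tiered connectivity checks, run from innermost to outermost layer.
# All addresses below are placeholders for your own setup.
import socket
import subprocess
import urllib.request
from datetime import datetime

def ping(host):
    if subprocess.run(["ping", "-c", "1", "-W", "1", host],
                      stdout=subprocess.DEVNULL,
                      stderr=subprocess.DEVNULL).returncode != 0:
        raise RuntimeError("no ping reply")

def tcp(host, port):
    # plain TCP connect, e.g. to a simple echo service exposed on the server
    with socket.create_connection((host, port), timeout=2):
        pass

def http(url):
    urllib.request.urlopen(url, timeout=2).close()

CHECKS = [
    ("router ping",      lambda: ping("192.168.1.1")),
    ("server tcp echo",  lambda: tcp("192.168.1.10", 7777)),
    ("reverse proxy",    lambda: http("http://192.168.1.10:8080/status")),
    ("app behind proxy", lambda: http("http://192.168.1.10/app")),
]

for name, check in CHECKS:
    try:
        check()
        result = "ok"
    except Exception as exc:
        result = f"FAIL ({exc})"
    print(f"{datetime.now().isoformat()} {name}: {result}", flush=True)
```

Whichever tier is the first to fail should narrow down where the break is.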
Maybe also set up https://www.thinkbroadband.com/broadband/monitoring/quality/ to see if it registers any packet loss or increased latency at those times (although I’d still do the above as well).
Thank you for the information! I kind of suspected it’d be like that tbh.
Out of curiosity, how much of the internet is unusable with js disabled? As in, how often do you run into sites that are essentially non-functional without it?
“If you get sued for the lies our AI pumped onto your website that we paid you for, it’s on you and nothing to do with us gl hf.”