I always feel a bit weird when people ask me what I do in my own spare time and my answer is basically fixing my shit, then pushing it just hard enough that it breaks again.
Unfortunately, copyright protection doesn’t extend that far. AI training is almost certainly fair use, if it is copying at all. Styles and the like cannot be copyrighted, so even if an AI creates a work in the style of someone else, it is extremely unlikely that the output would be so similar as to infringe copyright. Though I do feel that it is unethical to intentionally try to reproduce someone’s style, especially if you’re doing it for commercial gain. But that is not illegal unless you try to pass yourself off as that artist.
https://www.eff.org/deeplinks/2023/04/how-we-think-about-copyright-and-ai-art-0
So at the bare minimum, a mechanism needs to be provided for retroactively removing works that would have been opted out of commercial usage if the option had been available and the rights holders had been informed about the commercial intentions of the project.
If you do this, you limit access to AI tools exclusively to big companies. They already employ enough artists to create a useful AI generator; they’ll simply add a clause to the employment contract saying the artist agrees to have their work used in training. After a while, the only people who have access to reasonably good AI are those major corporations, and they’ll leverage that to depress wages and control employees.
The WGA’s idea that the direct output of an AI is uncopyrightable doesn’t distort things so heavily in favor of Disney and Hasbro. It’s also more legally actionable. You don’t name Microsoft Word as the editor of a novel because you used spell check, even if it corrected the spelling and grammar of every word. Naturally, you don’t name generative AI as an author or creator either.
Though the above argument only really applies when you have strong unions willing to fight for workers, and with how gutted they are in the US, I don’t think that will be the standard.
e.g. wind farms take up a lot of space
Though the vast majority of this space can still be used. I live near a wind farm, and the area under the turbines is still ranchland. There are cows just chilling under them. The wind company pays farmers for the land in a long-term lease agreement: https://www.wri.org/insights/how-wind-turbines-are-providing-safety-net-rural-farmers.
Well, the typical way of measuring Q does measure the energy it takes to get the boulder up the hill, but not the inefficiency of the machine that pushes it up there, nor the inefficiency of extracting its energy as it rolls back down.
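To make that concrete, here’s a minimal sketch of how the oft-quoted plasma Q differs from the wall-plug gain once those machine losses are counted. All the numbers are made-up placeholders for illustration, not figures from any real experiment:

```python
# Sketch of plasma Q vs. "wall-plug" Q. All numbers are illustrative
# placeholders, not measurements from any real fusion experiment.

def wall_plug_q(plasma_q, heating_efficiency, conversion_efficiency):
    """Net gain after accounting for machine losses.

    plasma_q: fusion power out / heating power absorbed by the plasma
        (the energy to get the boulder up the hill)
    heating_efficiency: fraction of wall-plug power that actually
        heats the plasma (lasers and RF heating are lossy)
    conversion_efficiency: fraction of the fusion output recovered
        as electricity (thermal cycles are well below 1.0)
    """
    return plasma_q * heating_efficiency * conversion_efficiency

# A plasma Q of 1.5 ("breakeven!") can still be a big net loss:
print(wall_plug_q(plasma_q=1.5, heating_efficiency=0.1,
                  conversion_efficiency=0.4))  # 0.06 -> a 94% net loss
```

That’s why “scientific breakeven” headlines can be true while the facility as a whole still draws far more power than it produces.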
There’s a lot of unsexy research that could make fusion come a whole lot sooner: more efficient high-power lasers, better cooling methods and designs for superconducting electromagnets, more efficient containment methods, more thought on how to extract energy from the plasma efficiently, and then making reactors cheap enough to build and maintain that we can actually afford them.
You’re right, copyright won’t fix it; copyright will just enable large companies to leverage more of their work and extract more from the creative space.
But who will benefit the most from AI? The artists seem to be getting screwed right now, and I’m pretty sure that Hasbro and Disney would love to cut costs and lay off artists as soon as this blows over.
Technology is capital, and in a capitalist system, that goes to benefit the holders of that capital. No matter how you cut it, laborers including artists are the ones who will get screwed.
Hopefully, it can also be used to prove that someone was not at the scene of a crime, enabling prosecutors to rule out suspects and innocent people to be cleared.
The person outright rejects defederation as a solution when it IS the solution
It’s the solution in the sense that it removes the material from view of users on the mainstream instances. It is not a solution to the overall problem of CSAM and the child abuse that creates such material. There is an argument to be made that that is the only responsibility of instance admins, and that anything past that is the responsibility of law enforcement. This is sensible, but it invites law enforcement to start overtly trawling the Fediverse for offending content, creating an uncomfortable situation for admins and users, as they will go after admins who simply do not have the tools to effectively monitor for CSAM.
Defederation also obviously does not prevent users of the instance from posting CSAM. Even unknowingly hosting CSAM can easily lead to the admins being prosecuted and the instance being taken down. Section 230 does not apply to material that is illegal on the federal level, and SESTA requires removal of material that violates even state-level sex trafficking laws.
Or you’re me, and can use neither trouble-free. I’m basically this man.
Wonder if businesses will replace the Twitter logo in their windows as well.
I doubt they will, for a while at least. This change was so sudden that a lot of people simply won’t know what X is. It doesn’t look like a social media icon, and many people just won’t be familiar with it.
It’s also horribly forgettable; even if I did use X regularly, I might just forget what the icon looked like out of context.
Uploaded videos are now X videos.
Bungie is the developer of the Halo and Destiny franchises.
If a platform is only usable by someone of at least average tech savviness, then its reach is limited to half the population. This would be extremely limiting for a mass market product. As such, platforms cater to the least savvy 20% or lower.
The daughter pleaded guilty and is scheduled for sentencing on July 20th. The mother is going to trial on the 14th. If convicted, she’ll get her own sentencing hearing.
As a 70-qubit quantum computer, it’s not going to be doing many helpful calculations. The benchmark used is random circuit sampling, which performs a bunch of random quantum operations and then reads the result; its runtime is compared against a supercomputer simulating the same random operations. The algorithm isn’t useful outside of benchmarking.
This also makes Sycamore a particularly ineffective “weapon”, considering that we don’t really use encryption with keys shorter than 1024 bits, which is well outside the capability of our current quantum computers.
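For scale, a commonly cited circuit construction (Beauregard-style) needs roughly 2n + 3 logical, error-corrected qubits to run Shor’s algorithm against an n-bit RSA modulus. The arithmetic below is just that back-of-the-envelope estimate, not a statement about any particular machine:

```python
# Back-of-the-envelope logical-qubit counts for Shor's algorithm,
# using the commonly cited ~2n + 3 figure for an n-bit modulus.
# Real hardware would also need enormous error-correction overhead
# on top of these *logical* counts -- compare with Sycamore's 70
# physical qubits.

def shor_logical_qubits(modulus_bits):
    return 2 * modulus_bits + 3

for n in (1024, 2048, 4096):
    print(f"RSA-{n}: ~{shor_logical_qubits(n)} logical qubits")
# RSA-1024: ~2051 logical qubits
# RSA-2048: ~4099 logical qubits
# RSA-4096: ~8195 logical qubits
```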
It’s a 70-qubit quantum computer. It doesn’t have enough memory to break even rudimentary 128-bit encryption.
The algorithm it executed was also not Shor’s algorithm (the one that could potentially break encryption). The benchmark used is called random circuit sampling, which is just doing a bunch of random quantum operations between pairs of qubits and then reading the output. It shows one of the largest known quantum speedups of any algorithm.
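For the curious, here’s a toy state-vector sketch of what random circuit sampling looks like. This is a numpy simulation I made up for illustration, not Google’s actual setup; mostly it shows why the classical comparison blows up, since the simulator has to track 2^n complex amplitudes:

```python
# Toy state-vector sketch of random circuit sampling (illustrative,
# not Google's actual setup). The point: a classical simulator must
# track 2**n complex amplitudes, so 70 qubits would need roughly
# 2**70 * 16 bytes (~19 ZB) of RAM.
import numpy as np

rng = np.random.default_rng(0)
n = 10                          # toy size; 70 is hopeless classically
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                  # start in |00...0>

def random_unitary(dim):
    """Haar-ish random unitary via QR decomposition."""
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

def apply_two_qubit_gate(state, gate, q1, q2, n):
    """Apply a 4x4 unitary to qubits q1 and q2 of an n-qubit state."""
    psi = np.moveaxis(state.reshape([2] * n), (q1, q2), (0, 1))
    shape = psi.shape
    psi = (gate @ psi.reshape(4, -1)).reshape(shape)
    return np.moveaxis(psi, (0, 1), (q1, q2)).reshape(-1)

# "A bunch of random quantum operations between pairs of qubits":
for _ in range(40):
    q1, q2 = rng.choice(n, size=2, replace=False)
    state = apply_two_qubit_gate(state, random_unitary(4), q1, q2, n)

# "...and then reading the output": sample bitstrings with
# probability |amplitude|**2.
probs = np.abs(state) ** 2
probs /= probs.sum()            # guard against float rounding drift
samples = rng.choice(2**n, size=5, p=probs)
print([format(int(s), f"0{n}b") for s in samples])
```

The benchmark then compares how long the real chip takes to produce such samples against how long a supercomputer takes to simulate the same circuit.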
What will happen to 99% Invisible?
True, I wrote this from a US law perspective, where that kind of behavior is expressly protected. US law is also written specifically to protect things like search engines and aggregators, to keep services like Google from getting sued over their blurbs, but it’s likely also a defense for AI.
Regardless of whether it should be illegal, I feel that AI training and use is legal under current US law. And since OpenAI is a US company, dragging it into UK courts and extracting payment from it would be difficult for all but the most monied artists.