Medical, banking, and insurance are three industries the European data privacy watchdogs are much stricter about, because of the potential for damage.
I'd say the numbers listed here prove the GP's point about poor enforcement. The largest fine is roughly 0.97% of Meta's 2023 revenue, the equivalent of a $600 fine for somebody making $60k/year. It's a tiny-tiny cost of doing business at best, and definitely not a deterrent, given Meta's blatant disregard for GDPR since then.
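The proportions above can be sanity-checked with quick arithmetic. The figures below are assumptions pulled from public reporting (Meta's 2023 revenue of roughly $134.9B and the record EUR 1.2B fine at ~$1.3B), both approximate:

```python
# Back-of-the-envelope check of the fine-to-revenue ratio.
# Assumed figures (approximate): Meta 2023 revenue ~$134.9B,
# record GDPR fine EUR 1.2B ~= $1.3B.
revenue = 134.9e9
fine = 1.3e9

ratio = fine / revenue
print(f"Fine as share of revenue: {ratio:.2%}")

# Scale the same ratio down to a $60k income:
salary = 60_000
print(f"Equivalent personal fine: ${salary * ratio:,.0f}")
```

With these inputs the ratio comes out just under 1%, and the scaled-down personal equivalent lands around $580, in line with the "$600 on $60k" comparison.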
> the equivalent of a $600 fine for somebody making 60k / year
I don't know about you, but on that income I would certainly not brush off such a fine as a "cost of doing business". Would it cause me financial trouble, or would it force me to sacrifice other expenses? Absolutely not. But would I feel frustrated at having to pay it, feel stupid for my mistake, and do my best to avoid it in the future? Absolutely yes.
My bad; a better analogy would be a dealer making $60k/year selling drugs who gets caught by police and is fined $600. I wouldn't expect them to change much.
1% of Meta's global revenue is a tiny-tiny cost of doing business? At that point, I think I can stop even trying to argue here. It's a massive fine any way you put it, especially when you consider that the ceiling hasn't been reached and non-compliance is, by design, increasingly costly.
Their net profit was $60 billion in 2024. This is peanuts. It can fluctuate by multiples of this fine in a month, depending on whether they've had a bad or good month, never mind year. This pretty much is just a cost of doing business.
The interesting part is that it keeps going up. You seem to believe we have somehow reached a cap where Meta can just expense it as a cost of doing business. That's not how European law works. The fine maximum is far higher, and repeated non-compliance keeps pushing the fines higher and higher. It's a ladder, not a sizing precedent.
Unfortunately it doesn't in practice. Meta's total revenue since 2018 when GDPR came into force is just shy of $1T. Even with all the smaller fines combined, the total amount of GDPR related fines is in the range of $3B. It's a rounding error.
There isn't a trend of increasing fines, nor has any fine even reached the cap, let alone been applied multiple times for recurring violations. And that's even less likely now, given the current US administration's foreign policy towards the EU.
While the GDPR is fine as a law (with the exception of its enforcement limitations), enforcement so far has been a complete joke.
The maximum GDPR fine is 4% of global revenue in the previous year. If a company has a 30% profit margin then they can, in theory, treat it as a cost of doing business, indefinitely.
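The "cost of doing business" claim above reduces to a fixed ratio: a fine capped as a share of revenue eats a constant fraction of profit at a given margin, regardless of company size. A quick check (the 30% margin is the hypothetical from the comment, not Meta's actual figure):

```python
# How much of a year's profit does the maximum GDPR fine consume,
# given a hypothetical 30% profit margin?
margin = 0.30      # profit as a share of revenue (assumed)
max_fine = 0.04    # GDPR cap: 4% of global annual revenue

share_of_profit = max_fine / margin
print(f"Max fine as share of annual profit: {share_of_profit:.1%}")
```

At a 30% margin, a single maximum fine takes roughly 13% of a year's profit; painful, but survivable as a recurring expense, which is the point being made.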
It's 4% per fine. Each violation is a fine and Meta owns multiple companies that can be fined. But 4% of global revenue already can't be treated as just a cost of doing business. Their shareholders would murder them.
So true. Can't wait for NIS2 to be implemented in my location (EU); the new directive allows authorities to hold board members and CEOs personally responsible for cybersec fails (although only as a last resort, after trying other means).
The belt packs typically do a lot more than amplify ambient noise: they also handle RF, decryption of the audio signal (depending on the model), EQ, and other things. All while typically running on 2×1.5V AA batteries.
Audio gear isn't made to last long on batteries, it's made to be reliable for the hours a show typically lasts. I worked part-time as a sound tech (paid hobby) for 15+ years, and I never started a show without fresh batteries, regardless of what the indicators on the transmitters/receivers told me.
Been a while since I looked into this, but afaik Maven Central is run by Sonatype, which happens to be one of the major players for systems related to Supply Chain Security.
From what I remember (a few years old; things may have changed), they required devs to stage packages to a specific test environment, where packages were inspected not only for malware but also for vulnerabilities before being released to the public.
NPM on the other hand... Write a package -> publish. Npm might scan for malware, they might do a few additional checks, but at least back when I looked into it nothing happened proactively.
It depends. If they simply ignore the ruling, my guess is that whatever trade agreements are in place have a mechanism for escalating such violations so that an Israeli court can enforce the order. I also take for granted that it depends a lot on the political climate and the strategic value for the governments of Israel and the US...
Similar to the Shai Hulud attack, but with more sophisticated C2 (blockchain, Google Calendar). It also uses Unicode characters to hide source code in IDEs, harvests ecosystem credentials to infect and publish new versions of packages you have access to, and more.
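The IDE-hiding trick mentioned above typically relies on Unicode format characters (zero-width and bidirectional controls) that many editors render invisibly. A minimal scanner sketch, assuming nothing about the specific attack beyond that class of characters (`find_invisible` is a name I'm making up for illustration):

```python
import unicodedata

# Characters in Unicode category "Cf" (format) include zero-width
# spaces and bidi overrides (e.g. U+200B, U+202E) that many IDEs
# render invisibly; scanning for them catches this hiding technique.
def find_invisible(source: str):
    return [
        (i, f"U+{ord(ch):04X}", unicodedata.name(ch, "UNKNOWN"))
        for i, ch in enumerate(source)
        if unicodedata.category(ch) == "Cf"
    ]

# Example: a string containing a hidden right-to-left override
sample = "user\u202eadmin"
print(find_invisible(sample))
```

Running such a check in CI over vendored or freshly installed packages is a cheap mitigation, though it won't catch the C2 or credential-harvesting parts.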
There could; it would essentially take the form of a standard library. That would work until someone decides they don't like the form/naming conventions/architecture/ideology/lack of ideology/whatever else, and reinvents everything to do the same thing in a slightly different way.
And before you know it, you have a multitude of distributions to choose from, each with their own issues...
> Deliberately extracting personal data into un-audited environments without good reason (eg printing a label for shipping), should be punished with GDPR-style global turnover-based penalties and jail for those responsible.
There already are, but only for Europeans through the GDPR.
Technically not quite: even in the EU, you don't have to provide the audit log for someone's data specifically, and as a data subject you have to make specific requests to delete or retrieve your data; it's not made transparent to you by default. But yes, you can't just dump it out anywhere you want.
How it should be is that personal data's current and historical disposition is always available to the person in question.
If that's a problem for the company processing the data (other than being fiddly to implement at first), that sounds like the company is up to some shady shit that they want to keep quiet about.
Nothing to hide, nothing to fear should apply here, and companies should be fucking terrified, with existential dread, of screwing up their data handling, and should be looking for ways to avoid handling PII at all costs. The analogy of PII to radioactive material is a good one. You need excellent processes and excellent reasons to be doing it in the first place, you must show you can do it safely and securely, and if you fuck up, you'd better hope your process documentation is top tier or you'll be in the dock. Or, better, you can decide that you can actually make do by handling the nuclear material only in some safer form, like encapsulated vitrified blocks, at least for most of your processes.
The data processing industry has repeatedly demonstrated that they cannot be trusted, and so they should reap the whirlwind.
It doesn't say audited environments as such, but you are required to use secure environments that you control as a baseline. What "secure" means can always be discussed, but in general it depends on what data you process and what you do with it; for large volumes, big populations, or Article 9 data, auditable environments should be expected (though not publicly auditable, although that would be nice...).
Fully agree with what you're saying, and my popcorn is ready for August, when the penalties part of the AI Act comes into force. There is a two-year grace period for certain systems already on the market, but any new model introduced after August this year has to be compliant. AI Act + GDPR will be a great show to watch...
The machine: https://en.wikipedia.org/wiki/Memex