Hacker News

There are two factors at play here; both you and the GP are making points that are correct.

1) As you say, "If you're abusing data, you're going to have a hard time- and that's good." Companies that are built on selling your data (e.g. data brokers in the marketing / finance industry) or sharing it without your consent (e.g. Facebook with Cambridge Analytica) will have to stop those practices. GDPR working as designed, win.

2) For business models that are viable under GDPR, at the margin GDPR is going to prevent small companies from entering the space, to the benefit of larger companies. Your example of Joe the Shoemaker is the trivial case. What if your business needs to collect PII and banking information, perform Know Your Customer checks, and retain that data for 5 years under the US Bank Secrecy Act? Or collect electronic personal health information? Or submit to any other conflicting regulatory regime? You're missing the fact that lots of businesses have a legitimate need to collect more than just an email, and that other regulations, which vary by country, directly conflict with GDPR. In these cases, adhering to GDPR is more than slapping a GDPR dialog onto your email submission modal; it can require a significant amount of time with expensive lawyers, figuring out how to comply with all of the applicable regulations at once.

This is the basic dirty truth about regulation: large companies can typically afford to lobby to make sure a regulation isn't going to ruin them, and then they can afford to implement it even when it is very complex. After implementation, regulation like GDPR becomes a moat. Consider how hard it is to start a company in highly-regulated spaces like finance or healthcare. I don't claim that GDPR is as deep a moat as those industries' regulations, but it's the same idea. The regulations as a whole can still be net-positive for society, but the risk is that when regulators (and those commenting on regulation) don't understand the real costs of complexity, it's easy to pile on rules that have the opposite of their intended effect.

Note: a common misconception about Google is that it sells/shares your data; in general, it does not. Google sells targeted ads, and your data is Google's competitive advantage. Google built Gmail, Android, and a host of other products precisely to get data that others cannot; your data is Google's moat. GDPR's restrictions on sharing your data with other companies therefore leave Google largely fine. Sure, the death of Privacy Shield might make Google's various international entities less able to share data among themselves, but the fundamental business model of collecting first-party data on users is alive and well.



> Or submit to any other conflicting regulatory regime?

I'm pretty sure the regulations have a catchall "you are allowed to store the data if legally required to" for exactly these kinds of issues. Need to store the data for 7 years for tax purposes? Fine.
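In engineering terms, that catchall usually ends up encoded as a retention table that a deletion pipeline consults before honoring an erasure request. A minimal sketch of the idea; the category names, statutes, and periods here are illustrative assumptions on my part, not legal advice:

```python
# Hypothetical sketch: map data categories to legal retention holds so a
# deletion job can tell "erase on request" data from "legally must keep" data.
# Statutes and periods below are examples, not a compliance reference.
from datetime import date, timedelta

RETENTION = {
    "marketing_email": None,               # no legal hold: erase on request
    "tax_invoice": timedelta(days=7 * 365),  # e.g. tax law: keep ~7 years
    "kyc_record": timedelta(days=5 * 365),   # e.g. Bank Secrecy Act: keep ~5 years
}

def can_erase(category: str, created: date, today: date) -> bool:
    """True if an erasure request can be honored for this record,
    i.e. no overriding legal obligation to retain it remains."""
    hold = RETENTION.get(category)
    if hold is None:
        return True  # no conflicting legal obligation
    return today >= created + hold

print(can_erase("marketing_email", date(2024, 1, 1), date(2024, 6, 1)))  # True
print(can_erase("tax_invoice", date(2020, 1, 1), date(2024, 6, 1)))      # False
```

The hard part, of course, isn't this lookup; it's deciding what belongs in that table, which is exactly where the lawyers come in.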


Sure: Article 6, section 1(c) (https://gdpr-info.eu/art-6-gdpr/) says processing is lawful if "processing is necessary for compliance with a legal obligation to which the controller is subject".

I think (like the GP) you're oversimplifying though.

Have you ever had to determine what is considered a "legal obligation"? It's not fun. Much to the distaste of us engineers, it turns out that most laws are not written in an unambiguous fashion; many (like HIPAA) are very vague in places, and rely on precedent or tribal knowledge about how the regulator in question tends to interpret things.

So yes, if you have a clear, unambiguous requirement to keep Personal Data, then you don't need to lawyer up. If you work in an industry with complex regulations, then you're going to need lawyers and/or consultants to tell you how to resolve the conflict between GDPR and those regulations.



