paxys 19 hours ago

Remember that Cambridge Analytica was "research" as well. Laws like these sound good on paper, but it's the company that has to deal with the fallout when the data is used improperly. Unless the government can also come up with a foolproof framework for data sharing and enforce adequate protections, it's always going to be better for the companies to just say no and eat the fines.

  • verst 18 hours ago

    As I recall, Cambridge Analytica involved a ton of OAuth apps (mostly games and quizzes) requesting all or most account permissions and then sharing this account data (access the user had expressly, if foolishly, granted) with a third-party data aggregator, namely Cambridge Analytica. Only this re-sharing of data with a third party was against Facebook's Terms of Service.

    I would not classify Cambridge Analytica as research. They were a data broker that used the data for political polling.

    • paxys 17 hours ago

      From https://en.wikipedia.org/wiki/Cambridge_Analytica

      > The New York Times and The Observer reported that the company had acquired and used personal data about Facebook users from an external researcher who had told Facebook he was collecting it for academic purposes.

      • tguvot 16 hours ago

        The link from the sentence you copy-pasted: https://en.wikipedia.org/wiki/Facebook%E2%80%93Cambridge_Ana...

        > The data was collected through an app called "This Is Your Digital Life", developed by data scientist Aleksandr Kogan and his company Global Science Research in 2013.[2] The app consisted of a series of questions to build psychological profiles on users, and collected the personal data of the users' Facebook friends via Facebook's Open Graph platform.[2] The app harvested the data of up to 87 million Facebook profiles.

        • pms 16 hours ago

          This "research" and data access wouldn't be allowed under the DSA, because (i) the researcher didn't provide any data protection safeguards, (ii) his university (and its data protection officer) didn't assume legal liability for his research, and (iii) his research wasn't focused on systemic risks to society.

          • tguvot 16 hours ago

            Not sure what point you are making, but under the "common sense comments act of 2054" unclear comments are not allowed.

            • brendoelfrendo 16 hours ago

              The article for this post is about the EU's Digital Services Act (DSA). Since the original comment argues against research access to data on the grounds that "Cambridge Analytica was research as well," another poster chimed in to rebut that assertion by arguing that Aleksandr Kogan's research would not have been allowed access to user data under the DSA, and thus that specific legal concern is moot.

              • tguvot 15 hours ago

                Kogan's "research" harvested data through an application, and he was outside of the EU.

                So even if it were happening today, whatever he did would be irrelevant to the EU/DSA unless they plan to chase everybody across the globe, somewhat like Ofcom going after 4chan.

                • shakna 12 hours ago

                  That's precisely what the EU is doing with Clearview AI [0].

                  > Max Schrems: “We even run cross-border criminal procedures for stolen bikes, so we hope that the public prosecutor also takes action when the personal data of billions of people was stolen – as has been confirmed by multiple authorities.”

                  [0] https://noyb.eu/en/criminal-complaint-against-facial-recogni...

                  • tguvot 11 hours ago

                    Based on your quote, it looks like this is exactly what the EU is not doing.

                    I like this quote more:

                    > Max Schrems: “Clearview AI seems to simply ignore EU fundamental rights and just spits in the face of EU authorities.”

                    • shakna 11 hours ago

                      Hence the upgrade to criminal charges against the company's officials.

                      There is _not_ a lack of action on the part of the EU here. They are "chasing" those responsible.

                      • tguvot 10 hours ago

                        Ohh... the upgrade will surely make them rethink the error of their ways and come begging the EU for forgiveness.

  • shakna 18 hours ago

    Which is why the EU doesn't use a letter-of-the-law system, and also has an ethics regulation system.

    So liability falls on those misusing the data, unless you knew it would be misused but collected it anyway.

    Golden rule: Don't need the data? Don't collect it.

  • _--__--__ 18 hours ago

    I don't think you get it: the EU has a law that says these researchers need to find casus belli to wrestle the norms of online freedom of speech away from American corporations. Therefore they get to request data on every account that has ever interacted with certain political parties on those platforms, as a treat.

    • jack_tripper 8 hours ago

      >EU has a law that says these researchers need to find casus belli

      Like Pepe the Frog memes.

  • pms 16 hours ago

    Long story short, this "research" and data access wouldn't be allowed under the DSA, because (i) the researcher didn't provide any data protection safeguards, (ii) his university (and their data protection officer) didn't assume legal liability for his research, and (iii) his research wasn't focused on systemic risks to society.

    • loeg 16 hours ago

      Platforms (reasonably!) do not trust random academic researchers to be safe custodians of user data. The area of research focus and assumption of liability do not matter. Once a researcher's copy of data is leaked, the damage is done.

      • whatevertrevor 13 hours ago

        Yup, when the data breach happens, the headlines aren't going to be "Random well-meaning researchers caught in data breach exposing user data". They're going to be "5 million Facebook logins hacked in massive data breach", and you'd be hard-pressed to find actual information on how the leak happened, just like the Gmail story from a few days ago.

        • pms 6 hours ago

          No researcher will request or get access to "5 million Facebook logins" through the DSA, since such a request wouldn't comply with the DSA requirements, so your point is moot. In fact, we live in quite a different world than you imagine. Currently, researchers don't even have access to public data, as the article points out. When it comes to private data, researchers won't get access to private messages either, but rather to aggregate-level, privacy-preserving data (assuming the DSA isn't killed by the industry and Republicans before any of this happens, which you seem to advocate for).
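
          To make "aggregate-level, privacy-preserving data" concrete, here is a toy sketch (my own illustration, not anything the DSA or any platform specifies) in which researchers get noisy group-level counts instead of raw accounts or messages:

            import random
            from collections import Counter

            def laplace_noise(scale: float) -> float:
                # The difference of two Exp(mean=scale) draws is Laplace(0, scale).
                return random.expovariate(1 / scale) - random.expovariate(1 / scale)

            def noisy_counts(records, group_key, epsilon: float = 1.0) -> dict:
                # A count query has sensitivity 1, so Laplace(1/epsilon) noise gives epsilon-DP.
                counts = Counter(group_key(r) for r in records)
                return {k: v + laplace_noise(1 / epsilon) for k, v in counts.items()}

            posts = [{"topic": "elections"}, {"topic": "elections"}, {"topic": "sports"}]
            print(noisy_counts(posts, lambda p: p["topic"]))  # e.g. {'elections': 2.3, 'sports': 0.8}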

  • pms 17 hours ago

    Republicans and Elon Musk have become very skilled at exerting political influence in the US [1] and Europe [2] through social media in ways the public isn't really aware of. Is this really that far from the goal of Cambridge Analytica of influencing elections without people's knowledge? Is it fine for large online platforms to influence election outcomes? Why wouldn't an online platform be used to this end if that's beneficial for it and there is no regulation discouraging it?

    [1] https://www.techpolicy.press/x-polls-skew-political-realitie...

    [2] https://zenodo.org/records/14880275

    • santadays 16 hours ago

      I can’t imagine this is not happening. There exists the will, the means and the motivation, with not a small dose of what pg might call naughtiness.

    • terminalshort 16 hours ago

      I can't stand this "influencing elections" nonsense. It's a term meant to mislead with connotations of manipulating the voting tabulation when what is actually going on is influencing people to vote the way you want them to, which is perfectly legal and must always be legal in a functioning democracy.

      • adamors 13 hours ago

        It’s so perfectly legal that other nation-states can do it for you, and you might end up with a winner that people hadn't even heard of before.

        https://en.wikipedia.org/wiki/2024_Romanian_presidential_ele...

        • terminalshort 13 hours ago

          Who are these "people" that you claim never heard of him before? Kind of sounds like BS to me because, you know, he won the election...

          • debugnik 8 hours ago

            "Won the election" here means 22.94% of the vote, not long after having polled at 5%. He didn't even have a party.

            This person went really quickly from being unknown to anyone not into politics, to being incredibly popular within a targeted media bubble, while remaining unknown to anyone unfamiliar with that bubble.

            • terminalshort 4 hours ago

              When that "bubble" is at minimum one quarter of the country, maybe you need to reconsider who is in the bubble here.

              • tpm 4 hours ago

                It's not "at minimum one quarter of the country": the turnout was 52.56% of registered voters, so it's at minimum 22.94% of 52.56% of 18 million, which is about 11% of the total population of Romania.
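
                A quick back-of-the-envelope check of that arithmetic (the ~19 million total-population figure is my assumption):

                  registered_voters = 18_000_000
                  turnout = 0.5256
                  vote_share = 0.2294
                  total_population = 19_000_000   # approximate population of Romania (assumed)

                  votes = registered_voters * turnout * vote_share
                  print(round(votes))                        # ~2,170,000 votes
                  print(round(votes / total_population, 2))  # ~0.11, i.e. about 11% of the population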

      • pms 7 hours ago

        It's obviously not about "manipulating the voting tabulation". Influencing people to vote the way you want them to is fine as long as it's not based on deceit. Is this what you can't stand?

        • terminalshort 5 hours ago

          I hate it because I have never seen it used any way other than: when my side does it, it's "campaigning," and when the other side does it, it's "influencing elections."

          • pms 2 hours ago

            Agreed, that's inconsistent and not OK. We need rules, procedures, and systems that apply equally to different sides.

dmix 18 hours ago

I'd hate to be the engineer that has to deal with these requests. Not even a formal government investigation, just any number of random third-party researchers demanding specialized access.

  • pms 18 hours ago

    If applications and datasets are fragmented, then that's going to be a nightmare for all stakeholders, including:

    * researchers, because they will have to write data access applications, including a description of planned safeguards detailed enough that their university is ready to take on legal liability (and you can imagine how easy that will be), and

    * digital service coordinators, because it will take ages for them to process applications from thousands of researchers each requesting a slightly different dataset.

    In the end, we need to develop standardized datasets across platforms and to streamline data access processes so that they're safe and efficient.
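
    To make "standardized" concrete, here is a purely hypothetical sketch of what a machine-readable data access application could look like; the field names are my invention, not anything defined in the DSA or by any digital service coordinator:

      from dataclasses import dataclass, field

      @dataclass
      class DataAccessApplication:
          researcher: str                 # vetted researcher and institution
          platform: str                   # platform the data is requested from
          dataset: str                    # identifier of a standardized cross-platform dataset
          purpose: str                    # the systemic-risk question being studied
          safeguards: list[str] = field(default_factory=list)  # planned data protection measures
          liable_institution: str = ""    # university assuming legal liability

      app = DataAccessApplication(
          researcher="Example University, Computational Social Science Lab",
          platform="ExamplePlatform",
          dataset="public-posts/political-ads/v1",  # hypothetical standardized dataset name
          purpose="Measuring amplification of political content",
          safeguards=["EU-hosted secure enclave", "no raw-data export", "aggregate outputs only"],
          liable_institution="Example University",
      )

    With a shared format like this, digital service coordinators could triage thousands of applications far more efficiently than with free-form documents.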

  • paxys 17 hours ago

    Engineers don't deal with the requests, lawyers do. No regular engineer at any big tech company is ever going to be in a position where they are responsible for such decisions.

    • lanyard-textile 17 hours ago

      Legal will be the face of it, but engineers often handle the actual underlying request.

      Across a couple of large public companies, I've had to react to a court ruling and stop an account's actions, work with the CA FTB on document requests, provide account activity as evidence in a case, things like that.

    • thrwaway55 16 hours ago

      Uhhhhhhhh, who do you think builds the tools that enforce the things legal stamps?

      "Delete all docs we aren't legally required to retain on topic Y before we get formally subpoenaed. We expect it to be on XXX based on our contact."

  • Nextgrid 18 hours ago

    The key is to make all the data public so there is no concept of “specialized access” and then you’re golden.

  • consumer451 18 hours ago

    > Engineer

    Let me know when devs get stamps that make them legally liable for their decisions. Only then will that honor be applicable to software.

    • tdb7893 17 hours ago

      Most of my friends are mechanical or aerospace engineers, and it's all the same job in a different medium (many do a significant amount of software as part of their work). They don't have stamps and aren't any more legally liable than we are, and saying we aren't engineers just seems to be a misunderstanding of what engineering is.

      • consumer451 17 hours ago

        I grew up in a structural and civil engineering family. My issue is that there is no path to "professional engineer" or "architect" in software, which, as a Hammurabi old, makes me suspicious of the entire profession. I am involved in software dev, and I would never call it engineering. This might be niche semantics, and yet it feels very important to me.

        https://en.wikipedia.org/wiki/Regulation_and_licensure_in_en...

        https://en.wikipedia.org/wiki/Engineering_law

        https://en.wikipedia.org/wiki/Code_of_Hammurabi

        • zeroonetwothree 17 hours ago

          Yes, well, unfortunately you aren't the English language semantics overlord. So it doesn't much matter what you think compared to general usage.

          • consumer451 17 hours ago

            That's "literally" fine with me.

        • terminalshort 16 hours ago

          If your definition of legitimacy rests on credentials rather than skill, your personality sounds more suited to a lawyer than an engineer to me.

          • consumer451 15 hours ago

            Personal insults aside, I hear you. In an attempt to get on the same page: do you know why the Code of Hammurabi called out deficient architecture/engineering?

            This is not a gotcha. My understanding is that bad physical engineering kills people. Is that your understanding as well?

            As software takes over more and more control of everything... do you see what I am getting at? Or, not at all?

            To be clear, my understanding is that physical professional engineer (PE) legal responsibility is not like the medical ethical code of "do no harm." It's just follow best practices and adopted standards, don't allow test:test login on things like fighter jets, etc. If you fail that, then there may be legal repercussions.

            We have allowed software "engineering" to skip all levels of basic responsibility, haven't we?

            • terminalshort 15 hours ago

              I do. And I get where the Hammurabi Code is coming from. But note that it makes no mention of credentials or who is allowed to do the engineering. Only that if you screw it up, the penalty is death.

              And I suspect that if you instituted such a system today, the results wouldn't be what you'd like. Failures in complex engineering are typically multiple failures that happen simultaneously, where any individual failure would have been non-fatal. The bugs are always lurking, and when different systems interact in unpredictable ways you get a catastrophic failure. And the number of ways that N systems can interact is on the order of 2^N, so it's impossible to think of everything. Applying the Hammurabi Code to software engineering wouldn't lead to safer software; it would lead to every engineer getting a lottery ticket every time they push a feature, and if the numbers come up, you die.
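
              To make the 2^N point concrete, a tiny illustration counting subsets of N components (which subsystems take part in a given interaction) versus mere pairwise interactions:

                for n in (5, 10, 20, 40):
                    # subsets of n components vs. pairwise interactions
                    print(n, 2**n, n * (n - 1) // 2)
                # 5 32 10
                # 10 1024 45
                # 20 1048576 190
                # 40 1099511627776 780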

              • consumer451 15 hours ago

                Apologies, I didn't get your reply before my last edit, as I do that kind of live revision. Certainly not a comment "engineer" over here... lol?

                It's not about being perfect, it's about, for example, "Sr Dev stamped this release; if you can log in with test:test after his stamp, he can get sued." Basic stuff, based on industry standards. In physical engineering, these basic standards are not optional. In software, YOLO: roll your own auth or crypto? YOLO! Ship now, you moron! (This is a lesson I am still trying to learn, as it's a good lesson. Aside from auth and cryptography... is there anything else like this? The fact that I have to ask this question is an indictment of the profession.)

                I realize how long it would take our entire industry to adjust to a real engineering standard. We will probably need to lose many more lives before doing so, like every other industry/profession has done before.

                Ideally, in the end, the YOLO management mentality will die out on core stuff, as real engineering responsibility takes over. Certain core software orgs will look a lot like your local structural firm: all real engineers, with legal liability up to the top.

        • noir_lord 17 hours ago

          I was an industrial electrician before I was a paid programmer.

          I worked with engineers, and what we generally do isn't engineering by the standards of those engineers.

          Which isn't to say that no software development is.

          People writing avionics software and medical software etc are doing what I’d recognise as engineering.

          It’s about the process more than anything.

          Software in the wild is simply a young field, and we aren't widely there yet.

          • consumer451 17 hours ago

            Related:

            > Collins Aerospace: Sending text messages to the cockpit with test:test

            https://news.ycombinator.com/item?id=45747804

            ___

            Think about how physical engineering orgs are formed. It's a collective of engineers, as it should be. The reason is that zero-consequence management abstraction layers cannot exist in a realm with true legal responsibility. Real engineering org structure is written in blood.

            I wonder what it will take for software to face that reality. I know that lack of regulation leads to faster everything, and I really do appreciate and love that... but as software continues to eat the world, there will be real consequences eventually, right?

            The reason that real engineering liability goes back to at least the Code of Hammurabi is that people got killed by bad decisions and corner cutting.

            What will that look like in software history?

          • cwillu 16 hours ago

            In the '90s, "Software is a young field" had a point. In the 2020s, though, I think we have to start admitting to ourselves that it is largely an immature/developmentally-delayed field.

    • rcbdev 11 hours ago

      In Austria, and I assume other parts of the developed world, you can of course become a software civil engineer (Ziviltechniker). It's a rather exhausting process requiring at least a graduate degree in the field, a few years of apprenticeship with a licensed civil engineer, and multiple examinations before becoming licensed.

      But as a reward, you get a shiny stamp with our government's sigil to put under your professional judgements, giving them the same significance in court as legal notices or attestations.

ggm 17 hours ago

American corporations don't want to accede to European rules about access to data, but it would be grossly simplistic to say all the problem lies on either side. I am not an EU resident, but I tend to think there are sound reasons for some of what they want, and as American corporate entities the majors are bound up in some tricky issues around US State Department expectations and FTC expectations. It isn't just trade; it's also civil liberties and privacy-vs-security political autonomy issues.

I would have preferred that companies like this had emerged as federated entities, with European data staying in European DCs and subject to European laws. I think it would have avoided a lot of this if they had not constructed themselves as a single US corporate shield, with transfer pricing on the side to maximise profit.

sva_ 16 hours ago

I'm not really sure what the argument is for letting these researchers make users of these platforms non-consenting participants in their studies. Is it even possible to opt out?

  • MiiMe19 16 hours ago

    Nope :D The EU has determined that you WILL be a part of these studies!

  • _el1s7 16 hours ago

    Yes, there is an opt-out: make your profile private.

  • xyzal 10 hours ago

    Well, I am far more concerned with a secretive megacorp shaping public discourse. The requested transparency is a means to assess the impartiality or bias of such platforms.

Workaccount2 18 hours ago

Europe doing everything possible to scare away modern industry.

  • pms 17 hours ago

    I hope it's quite the opposite, since this can lead to innovation as we figure out how to depolarize social media platforms or how to develop more politically neutral online media.

    • mc32 17 hours ago

      I agree on the need for depolarization (go back to timelines and get rid of recommendation engines), but once you cede control of content to government, even if it's for things people would agree on like "nuke that misinformation", you will end up being a mouthpiece for the government in power, whoever it may be. Look at how "innocent" embeds wholly shaped the messaging on Covid and how they sidelined all dissent (lots of people of renown and stature within the relevant disciplines).

      • pms 17 hours ago

        That's a great point. I agree that's a danger, but please note the DSA doesn't cede control of content to government; rather, it creates an institution of (national) Digital Service Coordinators (DSCs) that decide whether a researcher's access to data is well-reasoned. In most cases that institution will be in a different country (the country of the company's EU HQ) than the researcher. That said, there could be malicious players involved, e.g., researchers and their respective DSCs secretly recruited by a government to influence elections. This, however, sounds implausible, since in principle both the DSCs and the researchers are independent of national governments.

        Also, we can have depolarized recommendation algorithms. We don't need to go back all the way to timelines.
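
        As a purely hypothetical sketch of what a "depolarized" recommender could mean (this is my illustration, not a DSA requirement or any platform's actual ranker): keep personalized ranking, but trade a little predicted engagement for a lower predicted divisiveness score.

          from dataclasses import dataclass

          @dataclass
          class Candidate:
              post_id: str
              engagement: float    # predicted engagement, e.g. p(click/like), in [0, 1]
              polarization: float  # predicted divisiveness, in [0, 1] (assumed model output)

          def rank(candidates: list[Candidate], penalty: float = 0.5) -> list[Candidate]:
              # penalty = 0 reproduces pure engagement ranking; higher values push toward neutrality
              return sorted(candidates, key=lambda c: c.engagement - penalty * c.polarization, reverse=True)

          feed = rank([Candidate("a", 0.9, 0.8), Candidate("b", 0.7, 0.1), Candidate("c", 0.5, 0.0)])
          print([c.post_id for c in feed])  # ['b', 'a', 'c'] with the default penalty

        The contested part is the penalty weight and who gets to audit the polarization model, which is exactly the kind of question independent researchers would need data access to study.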

        • thesmtsolver 16 hours ago

          That's quite optimistic given the EU's track record in practice.

          E.g., Dieselgate: Europe was more affected, but the US caught Volkswagen cheating.

          https://en.wikipedia.org/wiki/Volkswagen_emissions_scandal#E...

          • pms 16 hours ago

            It's also quite optimistic to think that the industry will self-regulate, as the recent history of the Boeing 737 MAX shows...

            • thesmtsolver 16 hours ago

              No one is saying that the industry will self-regulate. There is a right amount of regulation, and all evidence suggests the EU is over that limit. The US is below it (probably), but closer.

              • pms 6 hours ago

                That's not clear to me at all. Would you mind elaborating?

                • Workaccount2 4 hours ago

                  Europe has a borderline shrinking economy and failed to become a player in the advanced technologies driving the world economy today. It's because they created an extremely hostile business environment, leading to founders going to the US to start their business there instead.

                  So now Europe is a continent full of people using American made software, running on American and Chinese made hardware, going on American social media to talk about how Europe is totally fine because they only work 9 months out of the year and don't allow young people to become billionaires.

                  • pms 12 minutes ago

                    I do think Europe should become more self-reliant. I also work a lot and rarely take any vacation time, but I don't see any sense (except for egoism) in working so much just to spread more AI slop online, mislead and polarize societies, or support a genocide. I think Europe learned its lesson during the World Wars.

            • mc32 16 hours ago

              It’s also optimistic to think the government will do what's good for the people, as exemplified by Chernobyl.

              There is no “good” answer. Each has its pros and cons.

              • pms 6 hours ago

                Yes, which is why we need to balance who has power and facilitate independent research, rather than giving power away to either industry or government.

        • terminalshort 16 hours ago

          This reads to me like: "please note DSA doesn't cede the control of content to government, but rather it creates a more obfuscated and shady government that pretends not to be a government, but is actually 10x worse and completely devoid of democratic control, and then it cedes control to that."

    • terminalshort 16 hours ago

      What you want is censorship. Using words like "depolarize" to hide it doesn't fool anyone. The polarization comes not from the platform, but from its users.

      • xyzal 10 hours ago

        Censorship from an American perspective? Maybe. We used to have similar views on free speech up until WW2, the rise of propaganda, and methods of mass manipulation. Since then, e.g., threatening speech has been beyond the line.

        But don't worry. You will learn your lesson soon. Remember, your leader says you don't have to go vote anymore.

        • terminalshort 5 hours ago

          You Europeans are the ones who saw the Nazis and decided to take a page out of their playbook, not us. Saying you are censoring "to prevent Nazis" means nothing to me because I've seen the people you call "Nazis". You are just another person who pretends to favor democracy, but in reality only means it if those damn voters vote for who you tell them to.

        • jack_tripper 8 hours ago

          >We used to have similar views on free speech up until WW2

          WW2 in Europe, and the fascism and Nazism that came to power and led to it, wasn't due to European countries having too much free speech but the contrary: they didn't have enough to act as a blowoff valve, so the people lashed out by electing stronger autocrats who opposed the status quo, to kick out the previous autocrats as revenge against the establishment. Same mindset that got Trump elected twice.

          It's not like pre-Hitler and pre-Mussolini Germans and Italians were living in a golden age of freedom and prosperity, and then suddenly out of the blue with no warning, Hitler and Mussolini just randomly came to power for no reason because their people had too much free speech and one day decided to throw it all away for shits and giggles.

          And today, history in Europe is repeating itself, with unpopular governments failing their voters and trying to prevent the rise of the right wing with censorship, which in turn further radicalizes the people against those in power, leading to the rise of the right wing just like pre-WW2. Nothing was learned; we're doomed to keep repeating the same mistakes every 100 years or so.

      • pms 6 hours ago

        Please don't put words in my mouth. You can speak only for yourself. And, frankly, it's not OK to write false information about a person you don't know.

        • terminalshort 4 hours ago

          I haven't put any words in your mouth. You want an authority to control what people are allowed to say and/or what people are allowed to hear. That's censorship. It makes no difference what words you use to convince yourself you aren't the authoritarian.

  • anigbrowl 17 hours ago

    What's the product of this industry? It certainly generates huge negative externalities.

  • MangoToupe 14 hours ago

    Please, take meta off our hands. I'm begging y'all. What a cancer on humanity.

darqis 6 hours ago

What about X? You have to pay to use the API.

abtinf 18 hours ago

Chat Control is a never ending battle.

hnidiots3 13 hours ago

Hmmm, apparently the EU Commission is not a bully (despite it acting like one). HN is very defensive of wanting to allow people to research and use your data.

hexage1814 18 hours ago

They should do their own scraping, for God's sake.

  • pms 18 hours ago

    Except platforms don't allow it and have sued over it...

    https://www.google.com/search?q=x+sues+researchers

    • Esophagus4 17 hours ago

      It's a shame the legal system favors big money, because courts typically rule that scraping public data is not against the law [1], in spite of how much the platforms protest.

      Sadly, big companies can bully the scrapers with nuisance lawsuits, as long as they can wear the scrapers down with legal costs before it gets to trial.

      [1] https://www.proskauer.com/release/proskauer-secures-dismissa...

      • pms 6 hours ago

        That's exactly the reason why we need platform transparency and accountability regulation like the EU's DSA.

    • loeg 16 hours ago

      Good!

      • pms 6 hours ago

        Please elaborate.

nothrowaways 16 hours ago

They left out X because he will bitch about it to his 600 million followers lol.

  • pms 6 hours ago

    Did you read the article? X was the first to be accused by the EU Commission.

Nio1024 15 hours ago

Because of the EU’s strict regulations, many internet products have simply given up on the European market. In some ways, this makes Europe seem a bit “behind the times.” Of course, the world still needs some conservatives to keep things in balance.

  • pms 6 hours ago

    I'd say that's fine. That's a great opportunity for the EU to develop its industry.

charcircuit 17 hours ago

Allowing mass scraping like researchers want will further push people away from public platforms towards private group chats like Discord.

user3939382 17 hours ago

These services shouldn’t exist in the first place.

SilverElfin 18 hours ago

Sorry, but this sounds like a privacy nightmare. No one should trust random researchers with mountains of data. They clearly won't know how to secure it. But also the angle of trying to police information and elections is disturbing, especially after several instances of the EU interfering in elections in recent times.

  • anigbrowl 17 hours ago

    The platforms themselves strike many as a privacy nightmare. I'm not aware of any mass data breaches that can be attributed to poor academic security in recent memory.

    > instances of the EU interfering in elections

    Do tell.

    • loeg 16 hours ago

      Eh. Users choose to use the platform. Users concerned about privacy don't have to use it.

      How can I, an end user who doesn't trust the ability of these researchers to keep my data private, prevent the platform from sharing my data with them?

  • anonymousDan 18 hours ago

    You talk as if the US hasn't attempted to interfere in elections.

    If online ads can be trivially used by big US tech companies to sway our elections using misinformation without it being observable to anyone or possible to refute (as would be the case for newspaper or TV ads) then why shouldn't it be monitored?

pms 17 hours ago

Republicans and Elon Musk have become very skilled at exerting political influence in the US [1] and Europe [2] through social media in ways the public isn't really aware of. Is this really that far from the goal of Cambridge Analytica of influencing elections without people's knowledge? Is it fine for large online platforms to influence election outcomes? Why wouldn't an online platform be used to this end if that's beneficial for it and there is no regulation discouraging it?

[1] https://www.techpolicy.press/x-polls-skew-political-realitie...

[2] https://zenodo.org/records/14880275

  • pms 7 hours ago

    I'm very curious why this is downvoted. Would any of the down-voters mind elaborating?