Fight the EU copyright reform!


#21

Okay, some more insight.

This is how political groups have voted:

You can get more insight into how your national parties voted here (you have to sign up for free):
http://www.votewatch.eu/en/term8-copyright-in-the-digital-single-market-draft-legislative-resolution-vote-commission-proposal-ordinar.html#/##vote-tabs-list-4

The thing will enter the trilogue now. I don’t expect much to change there. Then in spring a final vote will take place, with the options “accept the whole thing” or “reject the whole thing”. I don’t have much hope they go for the latter, because we would have to convince the MEPs to dump not only the upload filters and the link tax but also all the good stuff for artists and journalists within the directive. (If there is anything good.)

Also, related: The parliament even clapped after the vote.


#22

Fucking Conservatives ruining everything yet again.


#23

The EU is banning memes…


…by declaring them terrorism!


The proposal includes upload filters as well.
Wth, is this how the EU gets over Brexit? By turning itself into the police state the UK wants to be?


#24

But where does it say that memes are declared terrorism?

The EU’s executive body said “propaganda that prepares, incites or glorifies acts of terrorism” must be taken offline. Content would be flagged up by national authorities, who would issue removal orders to the internet companies hosting it. Those companies would be given one hour to delete it.

Totally agree with this, and even Google and Facebook welcome it:

“We share the European Commission’s desire to react rapidly to terrorist content and keep violent extremism off our platforms,” Google said. “We welcome the focus the Commission is bringing to this and we’ll continue to engage closely with them, member states and law enforcement on this crucial issue.”

Facebook said “there is no place for terrorism” on the social media platform.

As for the meme part, nothing would make me happier than if memes disappeared from the internet. I don’t like memes, I don’t use memes, and I don’t think we need memes.


#25

It doesn’t; this appears to be a separate thing. A troubling one nonetheless. You need to be extremely wary of governments that want you to let them define “extremist content” for you, given their track record of declaring peaceful revolutionaries, whistleblowers, anti-racist and human rights activists “extremists”.


#26

I am exaggerating here, but the truth is, filters detecting terrorism will work as badly as they do on copyright infringements. Memes are said to be safe under the copyright directive, but parody often differs from an actual infringement only in context. And since software cannot understand context, memes will surely be a huge source of false positives if filters are engaged.

The same will happen with this anti-terror thing. Sure, it is not aimed at xyz, but distinguishing xyz from terrorism is surely not as easy as our legislators think.

And remember, this does not only affect memes but pretty much any kind of expression online.
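
To make the context problem concrete, here is a toy sketch of what such a filter boils down to. Everything in it is made up (the reference text, the threshold, the similarity measure); real systems use audio/video fingerprints, but the blind spot is the same: the matcher only compares content against content and has no idea whether a match is a parody, a news quote or an actual infringement.

```python
# Toy sketch of a context-blind matching filter (all names, texts and thresholds invented).
from difflib import SequenceMatcher

REFERENCE_WORKS = {
    "pop_song_lyrics": "baby you were born this way ...",
}

def similarity(a: str, b: str) -> float:
    """Crude text similarity in [0, 1]; real filters use audio/video fingerprints instead."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def filter_decision(upload_text: str, threshold: float = 0.6) -> str:
    """Block anything that looks like a reference work. Context is never considered."""
    for name, reference in REFERENCE_WORKS.items():
        if similarity(upload_text, reference) >= threshold:
            return f"BLOCKED (matches {name})"
    return "allowed"

print(filter_decision("baby you were born this way ..."))     # straight copy -> blocked
print(filter_decision("baby you were *memed* this way ..."))  # parody        -> also blocked
print(filter_decision("my holiday photos, no music at all"))  # unrelated     -> allowed
```

A human sees the joke immediately; the matcher only sees that most of the bytes line up.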


#27

That’s not what this is about. It’s not filters that will detect terrorism, it’s the authorities, and if the authorities find something about terrorism, they will notify YouTube, Facebook or Twitter that it should be removed, and under this new rule it has to be removed within one hour:

Content would be flagged up by national authorities, who would issue removal orders to the internet companies hosting it. Those companies would be given one hour to delete it.


#28

Ultimately, it comes down to the fact that you cannot do everything with algorithms. The free and open internet, social media, YouTube etc. have all fundamentally changed the way we communicate, consume media and get our news. We can no longer afford to be dinosaurs about this. You cannot expect the established order of things to work the same way it did in the pre-internet era, and you certainly cannot start interfering with the idea of a free and open internet - something that is increasingly being seen as a human right - to fight old battles on behalf of corporate giants. And make no mistake, it is corporations who will benefit the most from this.

I refuse to believe that copyright infringement is a gargantuan problem for which the only solution is to cross-check everything we upload to the internet against a database. I also think that the looming threat of terrorism is not sufficient justification for mass surveillance. Not only are these dragnet provisions unnecessary, they’re grossly ineffective. The overwhelming majority of people who will be caught up in this web are people who are not actively breaching copyright law. Something as benign as a family’s home video of a wedding party could be flagged and monetized by Universal Music Group automatically if it happens to have a Lady Gaga song playing on the radio in the background. The number of problems with this is staggering, and the fact that it is passing is depressing, if not all that surprising.

In some better tech-related news, the ECHR declared GCHQ’s spying programs to be a breach of human rights.

What happens now?


#29

The linked article does not cover everything.

Proposal:


See Article 6, paragraph 2a:

  1. Where it has been informed according to Article 4(9), the competent authority referred to in Article 17(1)(c) shall request the hosting service provider to submit a report, within three months after receipt of the request and thereafter at least on an annual basis, on the specific proactive measures it has taken, including by using automated tools, with a view to:
  • (a) preventing the re-upload of content which has previously been removed or to which access has been disabled because it is considered to be terrorist content

#30

“Those who would give up their continual freedoms for temporary security, deserve neither their freedom nor their security.”
Benjamin Franklin


#31

It did cover this:

Under the proposal, internet companies would have to take measures, including installing automated systems, to prevent content from being re-uploaded after being removed the first time. Companies that fail to comply would face fines of up to 4 percent of their annual global turnover.

And that means that if content has been removed because it is terrorist content, it will be stopped from being uploaded again, which could include using automated tools. It’s not about a system that can detect terrorist content and prevent it from being uploaded in the first place; that’s a job for the authorities. The only thing an upload filter can do is stop content from being uploaded once more after it has been flagged as terrorism and removed.


#32

It really comes down to what content is meant here. The more this content depends on context, the less likely the recognition software with blocking abilities (= an upload filter, sorry, that’s what it is) will work flawlessly.

Once there is a piece of content that is not allowed to be uploaded, every upload has to be checked to see whether it is the one in question.

It gets even worse if the legislators try to include “edited content” as well, to prevent terrorists from bypassing these filters.

Also, press outlets reporting on terrorist content will have a hard time quoting or including that content, or at least parts of it. Which is bad as well.


#33

Why? It is the authorities that will read it before it is flagged as “propaganda that prepares, incites or glorifies acts of terrorism”. Don’t you think people will see the difference and not flag it as “propaganda that prepares, incites or glorifies acts of terrorism”?

I’m not sure what you are trying to say here, because this is what an upload filter will do:

including installing automated systems, to prevent content from being re-uploaded after being removed the first time.

The upload filter will only prevent deleted content from being uploaded again; it will not check everything.

But it’s not filters that flag it in the first place, that’s the authorities, and if the authorities don’t flag it, the upload filter will not stop it from being uploaded again. It must be removed first before the upload filter can work.


#34

The “it” that gets declared to be terrorist propaganda is not necessarily the same “it” that filters will detect and delete.

Sorry, that is not how it works. Everything does have to be checked. How else do you find out whether an upload is the one you want to prevent?

Where the hash for the filter comes from has nothing to do with the problems of filters in principle.
Under the copyright reform, the rightsholders also have to feed the filters with data first. Not much different.
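
Just to make the principle explicit, here is a minimal sketch (hypothetical code, not any platform’s real pipeline). Whether the fingerprints come from a rightsholder database or from an authority’s removal order changes nothing about the mechanics: every single upload still has to be fingerprinted and compared against the blocklist.

```python
# Minimal sketch of a re-upload filter; everything here is invented for illustration.
import hashlib

# Fingerprints of content that was removed earlier. Who supplied them
# (a rightsholder or an authority) makes no difference to the filter itself.
blocklist = set()

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def register_removed_content(data: bytes) -> None:
    """Called after a takedown: remember the removed content's fingerprint."""
    blocklist.add(fingerprint(data))

def handle_upload(data: bytes) -> bool:
    """Every upload gets fingerprinted and compared - there is no way around that."""
    return fingerprint(data) not in blocklist  # True = accept, False = reject

register_removed_content(b"removed propaganda clip")
print(handle_upload(b"removed propaganda clip"))   # False: exact re-upload is blocked
print(handle_upload(b"removed propaganda clip!"))  # True: one changed byte slips through
```

And that last line is exactly why the pressure to also catch “edited content” pushes everyone towards fuzzy matching, and fuzzy matching is where the false positives come from.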


#35

I don’t disagree, but the quotation culture is a bit of an interesting one to me. We cite a person and make sure to mention them to give the statement validity, but the argument should be equally good regardless of who says it.

To expand on this, if I were to say “My wife doesn’t need to know about my girlfriends in Paris - Benjamin Franklin”, it’s a shitty argument regardless of whether Mr Franklin actually thought that or not.


#36

But it will need to check that everything being uploaded is not that one thing. So it’ll cross-reference everything that goes up against this database, and sometimes it will not be accurate.


#37

Why? It will only prevent re-uploads of content that has been flagged by the authorities and removed.


#38

A wise man once said,

“don’t believe everything you read on the Internet just because it has a name and a quote next to it” - Abraham Lincoln


#39

The point isn’t who said it; the point is that the message is apt, and who said it is irrelevant. Benjamin Franklin’s sex life is irrelevant to this discussion; his stance as a freedom fighter, however, is relevant.

If you want my two cents, the right in the EU are being reactionary and greedy, and in turn suppressing the very morals they represent, in order to enact more jingoist policies or to serve big media corporations.

If you want to quote me on this: “The EU are a bunch of big dummies acting like they still have empires, bickering with each other, getting nothing real done, and we will all lose to China before the decade is out” (that last part is sarcasm).


#40

Okay, I’ll make it simple and in small steps:

  1. Government finds terrorist propaganda
  2. The terrorist propaganda is turned into mathematical comparison data so filters can use it
  3. Millions of users upload files / send posts
  4. The filter checks all these uploads against the data it was fed
  5. 10 results, of which maybe 5 are false positives

Other case:

  1. Government finds terrorist propaganda
  2. The terrorist propaganda is turned into mathematical comparison data so filters can use it
  3. Mass media on TV quote the propaganda
  4. Millions of users upload files / send posts, with thousands talking about this propaganda
  5. The filter checks all these uploads against the data it was fed
  6. 10,000 results, of which maybe 5,000 are false positives

This is bad.
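
To put rough numbers on the same idea (the rates below are invented purely for illustration): a fuzzy matcher that fires on some fraction of the legitimate look-alikes is harmless while almost nothing resembles the flagged material, but as soon as mass media quote it and thousands of people post about it, the absolute number of false positives explodes.

```python
# Back-of-the-envelope illustration; all numbers and rates are made up.
def filter_hits(real_reuploads: int, legit_lookalikes: int, false_match_rate: float = 0.5):
    """Hits a fuzzy matcher produces: every real re-upload plus the share of
    legitimate look-alikes (quotes, news reports, memes) it cannot tell apart."""
    false_positives = int(legit_lookalikes * false_match_rate)
    return real_reuploads + false_positives, false_positives

# Case 1: nobody is talking about the propaganda.
print(filter_hits(real_reuploads=5, legit_lookalikes=10))        # (10, 5)

# Case 2: TV news quoted it, thousands of posts now resemble it.
print(filter_hits(real_reuploads=5000, legit_lookalikes=10000))  # (10000, 5000)
```

Same filter, same data it was fed; the only thing that changed is how much harmless content now looks like the flagged material.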