IETF turns against harmful concentration on the Internet

The Internet Engineering Task Force (IETF) should engage more intensively with the issue of harmful concentration and consolidation of services on the Internet, demanded participants at the virtual 110th meeting of the standardization organization. The IETF should develop guidelines to counter negative concentration processes through suitable standardization.

Figures from a research team can serve as evidence of unwanted concentration: researchers from the University of Southern California (USC), SIDN Labs, InternetNZ and the University of Twente. According to them, in address zones such as the Dutch .nl, just five large cloud providers are responsible for one third of all DNS queries. And these five are familiar names: Google, Microsoft, Amazon, Facebook and Cloudflare.

Blessing or curse

Concentration and consolidation also have positive sides, Wes Hardaker of USC's Information Sciences Institute underlined at the IETF conference. Large providers can quickly secure or modernize their services by deploying new specifications such as IPv6, DNSSEC or QNAME minimization, if they want to. But Hardaker also noted that the concentration of important services in the hands of a few large players is a problem for the confidentiality and resilience of the entire Internet.

Because centralization creates single points of failure, the outage of an individual giant can have devastating consequences for many users. In 2016, for example, many users could not reach prominent websites such as Twitter, Spotify, SoundCloud or the New York Times because the large DNS provider Dyn had been hit by a massive DDoS attack.

On the other hand, many representatives and employees of the open technical standardization organization do not consider themselves responsible for competition issues. They have no influence on how protocols are implemented and used, they say; after all, they are "not the Internet police".

Developing with decentralization in mind

However, many specifications do involve "competing technical approaches or features" that can call for quite different deployment models, former IETF chair Jari Arkko writes in a recently published Internet Architecture Board report. Occasionally one can very well choose whether a new standard supports existing business models, whether it provides features that make participation easier for smaller players, or whether it even anchors a disruptive concept.

However, according to Arkko, the IETF often receives feedback from the institutions and companies that put the specifications into practice too late. End users, too, report their concerns or wishes too late.

Experts for the Domain Name System in the IETF are currently wrestling with a concrete case of how a standard can foster unfortunate centralization. Two years ago, DNS traffic to resolvers via HTTPS (DoH) was standardized in record time; since then it has become apparent that DoH can massively concentrate DNS traffic worldwide. If browser vendors such as Mozilla or Google ship their DoH implementations preconfigured and enable DoH by default, the DNS resolver entered in the browser settings becomes the focal point for DNS requests. Experience shows that the large majority of browser users rarely or never change DNS settings. Arkko praised last week that the IETF working group Adaptive DNS Discovery has taken up this problem and is working on decentralization measures.
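The mechanism behind this concentration effect can be illustrated with a short sketch. The following is a hypothetical example, not code from the article: it queries one public DoH endpoint (Google's JSON API at dns.google is assumed here; any resolver offering the same JSON interface would work the same way), which is exactly the kind of single preconfigured endpoint the discussion is about.

```python
# Minimal sketch of a DNS-over-HTTPS (DoH) lookup via a resolver's
# JSON API. The endpoint dns.google is an assumption for illustration.
import json
import urllib.parse
import urllib.request


def doh_url(name: str, rtype: str = "A",
            endpoint: str = "https://dns.google/resolve") -> str:
    """Build the query URL for a DoH resolver's JSON API."""
    query = urllib.parse.urlencode({"name": name, "type": rtype})
    return f"{endpoint}?{query}"


def doh_resolve(name: str, rtype: str = "A") -> list:
    """Send the DoH query and return the answer records (needs network).

    Every user whose browser ships with this endpoint preconfigured
    sends all DNS queries to this one operator -- the concentration
    effect the IETF discussion is about.
    """
    with urllib.request.urlopen(doh_url(name, rtype)) as resp:
        data = json.load(resp)
    return [answer["data"] for answer in data.get("Answer", [])]


# Example usage (requires network access):
# print(doh_resolve("example.com"))
```

Note that nothing in the protocol forces centralization; it is the default configuration shipped by a handful of browser vendors that funnels traffic to a few resolvers.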

However, the clock can hardly be turned back. Moreover, new security or reputation mechanisms tend to play into the hands of the big players. Email standards such as DMARC are an example of this, and new ideas like a "DMARC for DNS" with additional DNS error messages are therefore eyed critically.

Many experts complained in the discussion that the effects of implementing well-intentioned, secure standards are not foreseeable. The web, for example, has proven to be a strongly centralizing force, said Fastly developer and HTTP pope Mark Nottingham. "Would the web exist if we had ruled out centralizing effects?", he asked.


On what should be done, opinions within the IETF diverge. Phillip Hallam-Baker, scientist at Comodo, recommended, among other things, consistent standardization under the aspect of "end-user sovereignty". And Nottingham advised explaining the technical effects better to the competition authorities in charge.

In any case, the IETF must sharpen its internal awareness of the relationship between standards and undesirable concentration, Arkko conceded.

Guidelines for dealing with the problem technically and organizationally could help here; these could be recorded in an RFC. A workshop or even a working group should soon take up the exchange on this.
