
April 2000 COOK Report published on DDoS and Ideas of E. Gerck (FWD)




>Understanding Distributed Denial of Service pp. 1-16
>
>During the second week of February the largest and most diverse
>denial of service attacks in the history of the Internet caught
>several of the most important commercial web sites off guard and
>exposed what was previously a largely unsuspected operational
>vulnerability that affects the entire commercial Internet. Just as
>Al Haig stepped forward after Reagan was shot to say "don't worry,
>we're in charge here," we contend that Gene Spafford's February 19th
>summation of the White House meeting provides a soothing but
>superficial explanation of what is really a far more subtle and
>difficult structural weakness. This weakness is apparently inherent
>in the basic structure of the Internet and cannot be "enforced" out
>of existence.  We present in narrative form the NANOG and IETF
>technical discussions that resulted from the attacks.  The discussion
>demonstrates that Internet backbone engineers are by no means agreed
>on precisely what happened or on how to deal with it.
>
>On February 9, Lauren Weinstein, partner to Peter G. Neumann of the
>Risks mail list and co-sponsor with Neumann of People for Internet
>Responsibility, made the following observation: "It seems apparent that
>the rush to move all manner of important or even critical commercial,
>medical, government, and other applications onto the Internet and Web
>has far outstripped the underlying reality of the existing Internet
>infrastructure.  Compared with the overall robustness of the U.S.
>telephone system, the Internet is a second-class citizen when it
>comes to these kinds of vulnerabilities. Nor will simply throwing
>money at the Internet necessarily do much good in this regard. More
>bandwidth, additional servers, and faster routers--they'd still be
>open to sophisticated (and even not so sophisticated) attacks which
>could be triggered from one PC anywhere in the world. In the long
>run, major alterations will be needed in the fundamental structure of
>the Internet to even begin to get a handle on these sorts of
>problems, and a practical path to that goal still remains fuzzy at
>this time."
>
>Ed Gerck's Ideas pp. 17-22, 30
>
>Part Two of this issue contains an interview with Ed Gerck as well as
>two essays by him. He is co-founder of the Meta Certificate Group,
>http://mcg.org.br, CEO of Safevote, Inc., and Chairman of the IVTA.
>We suggest that his ideas form the basis for a fresh and compelling
>analysis of what we may really be dealing with.  We conclude that
>there is a possibility that the fundamental nature of the attacks may
>have been completely misunderstood.  We also contend that Gerck's
>theories, published here for the first time, may provide an entirely
>different mathematical basis for understanding the Internet as a
>quantum information structure possessing significantly different
>capabilities and potentials than could be extrapolated from our
>current understanding. Although this is quite a statement to make,
>his ideas have reached enough people that it is likely research
>will be rapidly undertaken to ascertain whether his own
>experimental results dating from 1998 are verifiable and
>reproducible. Gerck's ideas involve the foundation of an entirely new
>calculus for the operation of the Internet.
>
>Gerck asserts that the major reason the attacks were so successful is
>that the packets arrived at the target servers with a high degree of
>coherency - that is to say at almost the same instant.  He points out
>that the technical functionality of the Internet militates against
>the coherent arrival of large numbers of packets at a specific
>target, and thus a tenfold spike in incoming bandwidth would be
>very unlikely unless other unusual mechanisms are also at play.
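>
>To make the contrast concrete, here is a minimal simulation sketch of
>our own (not Gerck's model - the sender count, jitter window, and bin
>width are invented for illustration) comparing the peak arrival rate
>at a target when senders fire with random jitter versus all at once:
>
>  import random
>
>  def peak_arrivals(n_senders=1000, jitter_s=1.0, bin_s=0.01):
>      """Largest number of packets landing in any bin_s-wide slot.
>      Each sender emits one packet at a time drawn uniformly from
>      [0, jitter_s]; jitter_s near zero models coherent arrival."""
>      bins = {}
>      for _ in range(n_senders):
>          slot = int(random.uniform(0.0, jitter_s) / bin_s)
>          bins[slot] = bins.get(slot, 0) + 1
>      return max(bins.values())
>
>  random.seed(1)
>  print("incoherent peak:", peak_arrivals(jitter_s=1.0))   # spread over 1 s
>  print("coherent peak:  ", peak_arrivals(jitter_s=1e-9))  # near-simultaneous
>
>With 1,000 senders spread over one second, the busiest 10 ms slot
>sees on the order of 20 packets; with coherent arrival all 1,000 land
>in a single slot - a spike of exactly the kind Gerck argues the
>network would otherwise render improbable.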
>
>How then could the observed effects of the arrival of very large
>numbers of packets have happened?  He explains how his work in the
>quantum mechanics of lasers in the early 1980s gave him a hypothesis
>that he successfully tested in a university environment in 1998.
>Namely, he suggests that the number of entities in the Internet has
>reached a critical mass where a single event, such as a packet sent
>to a trin00 network, can result in an avalanche of coherent data
>amplification.  The result is similar to the coherent amplification
>process that sets off the sudden flash of a laser. He posits that
>when this occurs, packets behave very differently as they reach a
>target. Gerck suggests that such events trigger a kind of quantum
>behavior - one that always exists but becomes visible at the
>user-observable level and strongly contrasts with the classical
>behavior it replaces.
>
>Gerck's ideas represent a paradigmatic shift in the evaluation of the
>scope, function and behavior of the Internet. One problem of
>communication is that, to those stuck in the old paradigm, messages
>defining the new one are often unintelligible. For many people
>his ideas will be quite jarring.
>
>For example, his ideas reach to the root of what we call data. He
>suggests that data be thought of in terms of a natural quantity and
>as something that can be modeled with absorption, spontaneous
>emission and stimulated emission processes -- the last being a
>behavior associated with quantum systems. He finds that under certain
>conditions, stimulated data emission can win out over spontaneous
>data emission. This will happen when a minimum threshold of affected
>systems is disturbed by what may be a hacker attack, or the
>interaction of a virus with multiple systems or even by the
>unexpected appearance of a bug in operating software that everyone
>assumes to be stable. His findings lead to the conclusion that such
>perturbations, resulting in web site and/or network congestion, will
>happen with increasing frequency. Of course if he is right, when they
>do happen the next time, they may have absolutely nothing to do with
>hackers.
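>
>To spell out the analogy, here is a toy numerical sketch of
>laser-style rate equations (our illustration only - Gerck's own
>calculus is not reproduced here, and every rate constant below is
>invented) showing how stimulated emission overtakes spontaneous
>emission once a pumping parameter crosses a threshold:
>
>  def steady_intensity(pump, gain=1.0, loss=1.0, spont=1e-6,
>                       decay=1.0, dt=1e-3, steps=200_000):
>      """Integrate toy rate equations; return the final intensity.
>      N: excited systems (read: hosts primed to emit data).
>      I: circulating intensity (read: coherent packet flux).
>      The stimulated term gain*N*I takes over only once N is
>      pumped past the threshold N* = loss/gain."""
>      N = I = 0.0
>      for _ in range(steps):
>          dN = (pump - decay * N - gain * N * I) * dt
>          dI = ((gain * N - loss) * I + spont * N) * dt
>          N += dN
>          I += dI
>      return I
>
>  for pump in (0.5, 0.9, 1.1, 2.0):   # below and above threshold
>      print(f"pump={pump:.1f}  intensity={steady_intensity(pump):.2e}")
>
>Below threshold the output stays at the weak spontaneous level; just
>above it the intensity jumps by orders of magnitude - the same
>qualitative discontinuity the essay ascribes to coherent data
>amplification in a critical mass of machines.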
>
>After compiling the technical discussion from NANOG and IETF, it
>seems to us that the emphasis on traditional security measures is
>rather futile.  The Internet is too large, with too many machines
>under too many levels of control, for traditional measures of
>confining people and machines to be effective.
>
>Gerck has some very interesting ideas about constructing mechanisms
>where two parties which are not known to each other may use a third
>neutral environment in which to securely negotiate conditions of
>trusted operation.  He seems to have an uncanny sense of political
>power and psychology and how to reflect this in technical situations
>to build trust between parties that have no common ground for
>negotiation.
>
>As recently as a week ago we intended to publish only his two essays.
>However when we called him on the 25th of February to ask for answers
>to questions about the second essay on coherency, we found ourselves
>in the midst of a far ranging discussion that opened up some of his
>ideas of the physics of data and mechanics of trust that we had not
>heard before.  This discussion led to the interview on pages 17 to
>23.  The interview, which we have further expanded by asking several
>of our own experts to read it and put their own questions to Ed,
>begins to throw some light on the breadth and scope of his ideas.
>
>Gerck's ideas lead to a paradigm change on such fundamental questions
>as data flow in the Internet and the nature of security and trust in
>computer networking. Having a world view different from the
>prevailing gestalt often presents problems for everyone involved. We
>invite readers to ponder his message. We have known of Ed for perhaps
>almost two years and known him directly for six months. An unusual
>quality about him is that he is laid back. He is intuitive and
>skillful in dealing with people.  His ideas may succeed precisely
>because he doesn't push too hard.
>
>We have been a bit gun-shy about walking out on the end of a limb on
>behalf of the ideas of someone who is not yet well known and whose
>views are so iconoclastic.  For the last few weeks we have made some
>serious efforts to get some sanity checks from people in better
>positions than we are to judge what he presents.  Three very senior
>people have returned thumbs up.  We introduced a fourth such person,
>with the strongest technical background of all, to Gerck two weeks ago.
>
>When we asked this person how we might describe Gerck in this
>newsletter he replied:  "You might describe him as one of those bright
>people who are so frequently overlooked because he's happier working
>on hard problems than talking about it all. You might describe him as
>an Internet Guy who got here 'the hard way' -- he's trained as a
>physicist. He thinks about the world from a perspective of how do you
>model the stuff you perceive around you in mathematical terms -- and
>this leads him to different observations than those made by those of
>us who "grew up" in the Internet and distributed computing in
>general."
>
>One of the problems facing the Internet is that we have, sometimes
>with chewing gum and baling wire, built it into something on which a
>very large proportion of our economy is riding. The prevailing
>opinion in the wake of the DDoS attacks is to call in law
>enforcement, build the security walls ever higher and hunker down
>with publicly reassuring words to the effect of "don't worry, we are
>in charge here." A careful reading of the technical discussion on
>pages 2 through 16 of this issue will show that this position is
>founded on quicksand.  A reading of the Gerck essays and interview
>will reinforce this conclusion.
>
>We contend that the official views issued in the aftermath of the
>White House meeting of February may be well-intentioned.
>Nevertheless they are misguided. Without a correct diagnosis of our
>current problems, we will be unlikely to find solutions.  As a
>result, the Internet's behavior of early February may become more
>rather than less commonplace.
>
>Essays, pp. 23-27
>
>Thinking
>
>We present roughly half of Ed Gerck's Thinking Essay in the belief
>that readers will begin to understand why we consider it the single
>best short essay on the topic of information control, DNS Governance
>and ICANN ever written.
>
>"...there is nothing to be gained by opposing ICANN, because ICANN is
>just the overseer of problems to which we need a solution.
>
>My point is that there is something basically wrong with the DNS
>which precludes a fair solution - as I intend to show in the
>following text, the DNS design has a single handle of control which
>becomes its single point of failure. This needs to be overcome with
>another design, under a more comprehensive principle, but one which
>must also be backward-compatible with the DNS. [. . . .]
>
>So, the subject is domain names.  The subject could also be Internet
>voting. But I will leave voting aside for a while. In my opinion, the
>subject, in a broader sense, is information control. If domain names
>could not be used for information control (as they can now by default
>under the DNS - see below), I posit that we would not have any
>problems with domain names.
>
>But, domain names provide even more than mere information control -
>they provide for a single handle of control. DNS name registration is
>indeed the single but effective handle for information control in the
>Internet. No other handle is possible because: (1) there is no
>distinction in the Internet between information providers and users
>(e.g., as the radio spectrum is controlled); (2) there is no easily
>defined provider liability to control the dissemination of
>information (e.g., as advertisement and trademarks are controlled);
>(3) there is no user confinement to control information access (e.g.,
>as state or country borders in the Canadian Homolka case), etc.
>
>But, how did we end up in this situation? After all, the Internet was
>founded under the idea of denying a single point of control - which
>can also be seen as a single point of failure. The problem is that
>certain design choices in the evolution of the DNS, made long ago,
>have made users fully dependent on the DNS for certain critical
>Internet services.  These design choices further strengthened the
>position of DNS name registration as the single handle of information
>control in the Internet. And, in the reverse argument, as its single
>point of failure.  [. . . .]
>
>However, without the DNS there is no email service, search engines do
>not work, and web page links fail. Since email accounts for perhaps
>30% of Internet traffic - an old figure, it may be more nowadays -
>while search engines and links from other sites allow people to find
>out about web sites in about 85% of the cases (for each type, see
>http://www.mmgco.com/welcome/ ), I think it is actually an
>understatement to call the DNS a "handle."  The DNS is the very face,
>hands and feet of the Internet. It is the primary interface for most
>users - that which people "see". Its importance is compounded by the
>"inertia" of such a large system to change. Any proposal to change
>the DNS, or BIND nameservers, or the DNS resolvers in browsers in any
>substantial way would be impractical.
>
>[. . . .] Another fallacy in email is to ask the same system
>you do not trust (the DNS, with the in-addr.arpa kludge) to check the
>name you do not trust (the DNS name) when doing an IP-check on a DNS
>name. There are more problems, and they have just become more acute
>with the need to stop spam. Now administrators have begun to do a
>reverse DNS check by default.  Under such circumstances you MUST have
>both DNS and IP.
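>
>[Editor: for readers unfamiliar with the check Gerck criticizes, here
>is a minimal sketch of a forward-confirmed reverse DNS lookup. The
>address is a documentation placeholder and the result depends on live
>DNS; note that both steps query the very system being verified.]
>
>  import socket
>
>  def forward_confirmed_rdns(ip):
>      """Reverse-resolve ip via in-addr.arpa, then forward-resolve
>      the returned name and check that ip is among the answers.
>      Both lookups trust the DNS - the circularity noted above."""
>      try:
>          name, _, _ = socket.gethostbyaddr(ip)        # reverse lookup
>          _, _, addrs = socket.gethostbyname_ex(name)  # forward lookup
>      except (socket.herror, socket.gaierror):
>          return False
>      return ip in addrs
>
>  print(forward_confirmed_rdns("192.0.2.1"))  # placeholder address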
>
>Further, having witnessed the placing of decisions of network address
>assignment (IP numbers) together with DNS matters under the ruling of
>one private policy-setting company (ICANN), we see another example of
>uniting and making everything depend on what is, by design, separate.
>The needs of network traffic (IP) are independent of the needs of
>user services (DNS). They also serve different goals, and different
>customers. One is a pre-defined address space which can be
>bulk-assigned and even bulk-owned (you may own the right to use one
>IP, but not the right to a particular IP), the other is a much larger
>and open-ended name space which cannot be either bulk-assigned or
>bulk-owned. They do not belong together - they should not be treated
>together.
>
>But, there are other examples. In fact, my full study, conducted with
>the participation of Einar Stefferud and others, has so far catalogued
>more than forty-one essential problems caused by the current design
>of the DNS. Thus, a solution to current user wants is not to be
>reached simply by answering "on what" and "by whom" control is to be
>exerted, as presently done in all such discussions, without exception
>- for example, those led by ICANN. In this view, ICANN is not even
>the problem (as usually depicted by many) but simply the overseer of
>problems. At least, of 41+ main problems - all of which involve
>information control.
>
>Thus, by realizing both what these 41+ problems are and the
>underlying issue of information control in the Internet (an issue
>not ignored by governments), the study intended to lay the
>groundwork to provide for a collaborative solution to information
>flow in the Internet without the hindrance of these 41+ problems. The
>study also intends that the possibility of information control will
>be minimized as a design goal.   [. . . .]
>
>Regarding "time" - readers may ask what the schedule is for proposing
>new standards based on what my group and I are working on for domain
>names. As I see it, and as I also comment in regard to the work on
>advancing standards for Internet voting at the IVTA (where IMO the
>same principles apply), time is not a trigger for the events needed
>to get us out of our predicament, but understanding is. Cooperation
>has its own dynamics and we must allow for things to gel, naturally.
>We can motivate, we can be proactive but we must not be dominating.
>We seek collaboration, not domination. Both technically as well as
>market-wise."
>
>Coherent Effects in Internet Security and Traffic
>
>Here is a paragraph from Gerck's second essay.
>
>"This was not only a DDoS - this was a CDoS. A Coherent Denial of
>Service attack. The difference is that a distributed but incoherent
>attack would not have done any major harm. In order to explain how
>such an attack was possible and why it was effective, one needs to
>understand first that, normally, nothing is coherent in the Internet.
>All packets travel from source to destination in what may seem to be
>a random fashion; each host has unsynchronized time - oftentimes,
>even wrong time zones; and even the path traveled by each packet is
>also non-deterministic. Thus, achieving the coherent arrival of a
>stream of packets at one location by sending them from a large number
>of coordinated locations is a feat."
>
>****************************************************************
>The COOK Report on Internet
>431 Greenway Ave, Ewing, NJ 08618 USA
>(609) 882-2572 (phone & fax)
>cook@cookreport.com
>Index to 8 years of the COOK Report: http://cookreport.com
>"Battle for Cyberspace: How Crucial Technical . . ." - 392 pages,
>just published. See http://cookreport.com/ipbattle.shtml
>****************************************************************