Jiaqi Yan, Xin Liu, and Dong Jin (Illinois Institute of Technology). 2017. Simulation of a Software-Defined Network as One Big Switch. ACM SIGSIM Conference on Principles of Advanced Discrete Simulation (ACM SIGSIM PADS).

Software-defined networking (SDN) technology promises centralized and rapid network provisioning, holistic management, low operational cost, and improved network visibility. Researchers have developed multiple SDN simulation and emulation platforms to expedite the adoption of many emerging SDN-based applications to production systems. However, the scalability of those platforms is often limited by the underlying physical hardware resources, which inevitably affects the simulation fidelity in large-scale network settings. In this paper, we present a model abstraction technique that effectively transforms the network devices in an SDN-based network into one virtualized switch model. While significantly reducing the model execution time and enabling real-time simulation capability, our abstracted model also preserves the end-to-end forwarding behavior of the original network. To achieve this, we first classify packets with the same forwarding behavior into smaller and disjoint Equivalence Classes (ECs) by analyzing the OpenFlow rules installed on the SDN devices. We then create a graph model representing the forwarding behavior of each EC. By traversing those graphs, we finally construct the rules of the big-switch model to effectively preserve the original network's end-to-end forwarding behavior. Experimental results demonstrate that the network forwarding logic equivalence is well preserved between the abstracted model and the original SDN network. The model abstraction process is fast, e.g., 3.15 seconds to transform a medium-scale tree network consisting of 53,260 rules. The big-switch model is able to speed up the simulation by 4.3 times on average and up to 6.69 times across our evaluation experiments.
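The abstract outlines a three-step pipeline: derive disjoint equivalence classes (ECs) from the installed OpenFlow rules, model each EC's forwarding path, and collapse the paths into rules for a single big switch. The sketch below is an illustrative simplification of that idea, not the authors' implementation: it assumes destination-IP longest-prefix-match rules only, a hypothetical rule/topology encoding (RULES, INGRESS), and represents each EC as a disjoint destination-address range.

```python
"""
Minimal sketch of the big-switch abstraction idea (illustration only):
1) split the dst-IP space into disjoint equivalence classes (ECs) induced
   by all prefixes appearing in the switches' rule tables,
2) trace each EC hop by hop from every ingress port,
3) emit one (ingress, EC range) -> egress rule for the abstracted big switch.
The rule format, topology, and helper names are assumptions for this example.
"""
import ipaddress

# Hypothetical per-switch tables: longest-prefix match on the destination IP.
# An action is either ("fwd", next_switch) or ("out", external_port_name).
RULES = {
    "s1": [("10.0.0.0/24", ("fwd", "s2")), ("10.0.1.0/24", ("out", "h1"))],
    "s2": [("10.0.0.0/25", ("out", "h2")), ("10.0.0.128/25", ("out", "h3"))],
}
INGRESS = {"s1": ["h1"]}  # external ingress ports per switch (assumed)


def equivalence_classes(rules):
    """Finest partition of the dst-IP space induced by all rule prefixes:
    within each returned range, every switch applies the same rule."""
    bounds = {0, 2**32}
    for table in rules.values():
        for prefix, _ in table:
            net = ipaddress.ip_network(prefix)
            bounds.add(int(net.network_address))
            bounds.add(int(net.broadcast_address) + 1)
    points = sorted(bounds)
    return [(points[i], points[i + 1] - 1) for i in range(len(points) - 1)]


def lookup(table, addr):
    """Longest-prefix match of one address against a single switch table."""
    best = None
    for prefix, action in table:
        net = ipaddress.ip_network(prefix)
        if addr in net and (best is None or net.prefixlen > best[0].prefixlen):
            best = (net, action)
    return best[1] if best else None


def big_switch_rules(rules, ingress):
    """Trace each EC from every ingress port and collapse the path into a
    single (ingress port, EC range) -> egress rule for the big switch."""
    abstract = []
    for lo, hi in equivalence_classes(rules):
        probe = ipaddress.ip_address(lo)  # any address inside the EC works
        for sw, ports in ingress.items():
            hop, action = sw, lookup(rules[sw], probe)
            while action and action[0] == "fwd":  # follow the forwarding path
                hop = action[1]
                action = lookup(rules[hop], probe)
            if action:  # path ends at an external egress port
                for port in ports:
                    abstract.append(
                        ((port, str(probe), str(ipaddress.ip_address(hi))), action[1])
                    )
    return abstract


if __name__ == "__main__":
    for (in_port, lo, hi), out in big_switch_rules(RULES, INGRESS):
        print(f"in={in_port} dst in [{lo}, {hi}] -> out={out}")
```

With the example tables above, the sketch prints three abstracted rules (traffic entering at h1 exits at h2, h3, or h1 depending on the destination range), illustrating how end-to-end forwarding behavior can be preserved by a single rule table; the paper's actual technique additionally handles general OpenFlow match fields and builds explicit per-EC forwarding graphs.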