Davide Villa

PhD Candidate

Education

  • Ph.D. in Computer Engineering - Northeastern University (Current)
  • M.Sc. in Embedded Computing Systems - Sant’Anna School of Advanced Studies & University of Pisa (2018)
  • B.Sc. in Computer Engineering - University of Pisa (2015)

Research Interests

  • Open Radio Access Network (O-RAN)
  • 5G and beyond Cellular Networks
  • Software-defined Networking
  • Channel Characterization

Bio

Davide is a Ph.D. candidate in Computer Engineering at the Institute for the Wireless Internet of Things at Northeastern University, advised by Prof. Tommaso Melodia. He received his B.S. in Computer Engineering and his M.S. in Embedded Computing Systems cum laude from the University of Pisa and the Sant’Anna School of Advanced Studies in 2015 and 2018, respectively. From 2018 to 2020, he worked as a Research Scientist in the embedded systems and networks group at Raytheon Technologies (formerly United Technologies Research Center) in Cork, Ireland. His research interests focus on 5G and beyond cellular networks, O-RAN, channel characterization, and software-defined networking for experimental wireless testbeds.

Publications

The development of 6G wireless technologies is rapidly advancing, with the 3rd Generation Partnership Project (3GPP) entering the pre-standardization phase and aiming to deliver the first specifications by 2028. This paper explores the OpenAirInterface (OAI) project, an open-source initiative that plays a crucial role in the evolution of 5G and future 6G networks. OAI provides a comprehensive implementation of 3GPP and O-RAN compliant networks, including Radio Access Network (RAN), Core Network (CN), and software-defined User Equipment (UE) components. The paper details the history and evolution of OAI, its licensing model, and the various projects under its umbrella, such as the RAN, the CN, and the Operations, Administration and Maintenance (OAM) projects. It also highlights the development methodology, Continuous Integration/Continuous Delivery (CI/CD) processes, and end-to-end systems powered by OAI. Furthermore, the paper discusses the potential of OAI for 6G research, focusing on spectrum, reflective intelligent surfaces, and Artificial Intelligence (AI)/Machine Learning (ML) integration. The open-source approach of OAI is emphasized as essential for tackling the challenges of 6G, fostering community collaboration, and driving innovation in next-generation wireless technologies.

Link

5G and beyond cellular systems embrace the disaggregation of Radio Access Network (RAN) components, exemplified by the evolution of the fronthaul (FH) connection between cellular baseband and radio unit equipment. Crucially, synchronization over the FH is pivotal for reliable 5G services. In recent years, there has been a push to move these links to an Ethernet-based packet network topology, leveraging existing standards and ongoing research for Time-Sensitive Networking (TSN). However, TSN standards, such as Precision Time Protocol (PTP), focus on performance with little to no concern for security. This increases the exposure of the open FH to security risks. Attacks targeting synchronization mechanisms pose significant threats, potentially disrupting 5G networks and impairing connectivity. In this paper, we demonstrate the impact of successful spoofing and replay attacks against PTP synchronization. We show how a spoofing attack is able to cause a production-ready O-RAN and 5G-compliant private cellular base station to catastrophically fail within 2 seconds of the attack, necessitating manual intervention to restore full network operations. To counter this, we design a Machine Learning (ML)-based monitoring solution capable of detecting various malicious attacks with over 97.5% accuracy.
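As a rough illustration of this kind of monitoring (not the detector from the paper), the sketch below trains a classifier on simple per-message PTP timing features; the feature set, synthetic traces, and model choice are all assumptions made for the example.

```python
# Hypothetical sketch: classify PTP synchronization traces as benign or malicious
# from simple timing features. Feature names and data are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def synth_trace(n, attack=False):
    """Per-message features: [offset_us, path_delay_us, inter_arrival_ms]."""
    offset = rng.normal(0.0, 0.5, n) + (rng.normal(50.0, 10.0, n) if attack else 0.0)
    delay = rng.normal(10.0, 1.0, n) + (rng.normal(5.0, 2.0, n) if attack else 0.0)
    inter_arrival = rng.normal(125.0, 2.0, n)  # nominal 8 sync messages per second
    return np.column_stack([offset, delay, inter_arrival])

X = np.vstack([synth_trace(500), synth_trace(500, attack=True)])
y = np.array([0] * 500 + [1] * 500)  # 0 = benign, 1 = spoofed/replayed

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"detection accuracy: {accuracy_score(y_te, clf.predict(X_te)):.3f}")
```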

Link

Network slicing allows Telecom Operators (TOs) to support service provisioning with diverse Service Level Agreements (SLAs). The combination of network slicing and the Open Radio Access Network (Open RAN) enables TOs to provide more customized network services and higher commercial benefits. However, in the current Open RAN community, an open-source end-to-end slicing solution for 5G is still missing. To bridge this gap, we developed ORANSlice, an open-source network slicing-enabled Open RAN system integrated with popular open-source RAN frameworks. ORANSlice features programmable, 3GPP-compliant RAN slicing and scheduling functionalities. It supports RAN slicing control and optimization via xApps on the near-real-time RAN Intelligent Controller (RIC) thanks to an extension of the E2 interface between RIC and RAN, and service models for slicing. We deploy and test ORANSlice on different O-RAN testbeds and demonstrate its capabilities in several use cases, including slice prioritization and minimum radio resource guarantee.
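For intuition on what slice-aware radio resource control means in practice, here is a minimal, hypothetical sketch of PRB allocation with per-slice minimum guarantees and priority weights; the slice names, numbers, and policy are illustrative and are not taken from ORANSlice.

```python
# Hypothetical sketch of slice-aware resource allocation: each slice receives its
# guaranteed minimum share of PRBs, and the remainder is split by priority weight.
from dataclasses import dataclass

@dataclass
class Slice:
    name: str
    min_prb: int      # minimum PRBs guaranteed by the SLA
    priority: float   # weight used to share leftover PRBs

def allocate(slices, total_prb):
    alloc = {s.name: s.min_prb for s in slices}
    leftover = total_prb - sum(alloc.values())
    weight_sum = sum(s.priority for s in slices)
    for s in slices:
        # Integer truncation may leave a few PRBs unassigned in this toy version.
        alloc[s.name] += int(leftover * s.priority / weight_sum)
    return alloc

print(allocate([Slice("eMBB", 30, 3.0), Slice("URLLC", 20, 5.0), Slice("mMTC", 10, 1.0)], 106))
```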

Link

The next generation of cellular networks will be characterized by openness, intelligence, virtualization, and distributed computing. The Open Radio Access Network (Open RAN) framework represents a significant leap toward realizing these ideals, with prototype deployments taking place in both academic and industrial domains. While it holds the potential to disrupt the established vendor lock-ins, Open RAN's disaggregated nature raises critical security concerns. Safeguarding data and securing interfaces must be integral to Open RAN's design, demanding meticulous analysis of cost/benefit tradeoffs. In this paper, we embark on the first comprehensive investigation into the impact of encryption on two pivotal Open RAN interfaces: the E2 interface, connecting the base station with a near-real-time RAN Intelligent Controller, and the Open Fronthaul, connecting the Radio Unit to the Distributed Unit. Our study leverages a full-stack O-RAN ALLIANCE-compliant implementation within the Colosseum network emulator and a production-ready Open RAN and 5G-compliant private cellular network. This research contributes quantitative insights into the latency introduced and the throughput reduction stemming from the use of various encryption protocols. Furthermore, we present four fundamental principles for constructing security by design within Open RAN systems, offering a roadmap for navigating the intricate landscape of Open RAN security.
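To make the overhead question concrete, the following is a small, hypothetical microbenchmark of per-packet AEAD encryption cost in Python; the packet size, cipher choices, and iteration count are assumptions, and this is not the measurement methodology of the paper.

```python
# Hypothetical microbenchmark of per-packet AEAD encryption cost, in the spirit of
# quantifying protection overhead on high-rate interfaces such as the Open Fronthaul.
import os, time
from cryptography.hazmat.primitives.ciphers.aead import AESGCM, ChaCha20Poly1305

PACKET = os.urandom(1400)   # roughly an Ethernet-MTU-sized packet (illustrative)
N = 10_000

def bench(name, cipher):
    # Nonce reuse is acceptable here only because ciphertexts are discarded;
    # never reuse nonces in a real deployment.
    nonce = os.urandom(12)
    start = time.perf_counter()
    for _ in range(N):
        cipher.encrypt(nonce, PACKET, None)
    per_pkt_us = (time.perf_counter() - start) / N * 1e6
    print(f"{name}: {per_pkt_us:.1f} us per packet")

bench("AES-256-GCM", AESGCM(AESGCM.generate_key(bit_length=256)))
bench("ChaCha20-Poly1305", ChaCha20Poly1305(ChaCha20Poly1305.generate_key()))
```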

Link

Accurate channel modeling in real time faces remarkable challenges due to the complexities of traditional methods such as ray tracing and field measurements. AI-based techniques have emerged to address these limitations, offering rapid, precise predictions of channel properties through ground truth data. This paper introduces an innovative approach to real-time, high-fidelity propagation modeling through advanced deep learning. Our model integrates 3D geographical data and rough propagation estimates to generate precise path gain predictions. By positioning the transmitter centrally, we simplify the model and enhance its computational efficiency, making it amenable to larger scenarios. Our approach achieves a normalized Root Mean Squared Error of less than 0.035 dB over a 37,210 square meter area, processing in just 46 ms on a GPU and 183 ms on a CPU. This performance significantly surpasses traditional high-fidelity ray tracing methods, which require approximately three orders of magnitude more time. Additionally, the model's adaptability to real-world data highlights its potential to revolutionize wireless network design and optimization by enabling the real-time creation of adaptive digital twins of real-world wireless scenarios in dynamic environments.
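As a toy sketch of the input/output structure and the evaluation metric (not the paper's architecture), the snippet below feeds a two-channel grid, terrain plus a rough path-gain estimate, through a small convolutional network and computes a range-normalized RMSE; all shapes, layers, and data are illustrative.

```python
# Toy sketch: a small convolutional model maps a 2-channel input (terrain height and
# a rough path-gain estimate) to a refined path-gain map; evaluation uses a
# range-normalized RMSE. All shapes, layer sizes, and data are illustrative.
import torch
import torch.nn as nn

class PathGainNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1),
        )

    def forward(self, x):          # x: [B, 2, H, W]
        return self.net(x)         # predicted path-gain map: [B, 1, H, W]

def normalized_rmse(pred, target):
    rmse = torch.sqrt(torch.mean((pred - target) ** 2))
    return rmse / (target.max() - target.min())   # one common normalization choice

model = PathGainNet()
x = torch.randn(1, 2, 64, 64)        # [terrain, rough estimate] on a 64x64 grid
target = torch.randn(1, 1, 64, 64)   # synthetic stand-in for ground-truth path gain
print("nRMSE:", normalized_rmse(model(x), target).item())
```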

Link

Wireless network emulators are being increasingly used for developing and evaluating new solutions for Next Generation (NextG) wireless networks. However, the reliability of the solutions tested on emulation platforms heavily depends on the precision of the emulation process, model design, and parameter settings. To address, obviate, or minimize the impact of errors of emulation models, in this work, we apply the concept of Digital Twin (DT) to large-scale wireless systems. Specifically, we demonstrate the use of Colosseum, the world's largest wireless network emulator with hardware-in-the-loop, as a DT for NextG experimental wireless research at scale. As proof of concept, we leverage the Channel emulation scenario generator and Sounder Toolchain (CaST) to create the DT of a publicly available over-the-air indoor testbed for sub-6 GHz research, namely, Arena. Then, we validate the Colosseum DT through experimental campaigns on emulated wireless environments, including scenarios concerning cellular networks and jamming of Wi-Fi nodes, on both the real and digital systems. Our experiments show that the DT is able to provide a faithful representation of the real-world setup, obtaining an average similarity of up to 0.987 in throughput and 0.982 in Signal to Interference plus Noise Ratio (SINR).
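A minimal sketch of how real and twinned traces could be compared is shown below; the similarity score (one minus normalized mean absolute error) and the throughput values are assumptions for illustration, not the exact metric or data from the paper.

```python
# Sketch: compare real-world and digital-twin traces with a simple similarity score
# (1 minus normalized mean absolute error). The traces below are synthetic.
import numpy as np

def similarity(real, twin):
    real, twin = np.asarray(real, float), np.asarray(twin, float)
    denom = np.maximum(np.abs(real), 1e-9)        # avoid division by zero
    return 1.0 - np.mean(np.abs(real - twin) / denom)

real_tput = [92.1, 88.4, 90.7, 91.3]   # Mbps on the physical testbed (made-up values)
twin_tput = [91.0, 89.9, 90.1, 92.0]   # Mbps on the emulated twin (made-up values)
print(f"throughput similarity: {similarity(real_tput, twin_tput):.3f}")
```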

Link

As Fifth generation (5G) cellular systems transition to softwarized, programmable, and intelligent networks, it becomes fundamental to enable public and private 5G deployments that are (i) primarily based on software components while (ii) maintaining or exceeding the performance of traditional monolithic systems and (iii) enabling programmability through bespoke configurations and optimized deployments. This requires hardware acceleration to scale the Physical (PHY) layer performance, programmable elements in the Radio Access Network (RAN) and intelligent controllers at the edge, careful planning of the Radio Frequency (RF) environment, as well as end-to-end integration and testing. In this paper, we describe how we developed the programmable X5G testbed, addressing these challenges through the deployment of the first 8-node network based on the integration of NVIDIA Aerial RAN CoLab (ARC), OpenAirInterface (OAI), and a near-real-time RAN Intelligent Controller (RIC). The Aerial Software Development Kit (SDK) provides the PHY layer, accelerated on Graphics Processing Unit (GPU), with the higher layers from the OAI open-source project interfaced with the PHY through the Small Cell Forum (SCF) Functional Application Platform Interface (FAPI). An E2 agent provides connectivity to the O-RAN Software Community (OSC) near-real-time RIC. We discuss software integration, the network infrastructure, and a digital twin framework for RF planning. We then profile the performance with up to 4 Commercial Off-the-Shelf (COTS) smartphones for each base station with iPerf and video streaming applications, measuring a cell rate higher than 500 Mbps in downlink and 45 Mbps in uplink.
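For example, per-UE iPerf3 reports (run with the -J JSON flag) can be aggregated into a cell rate as in the hypothetical sketch below; the file names and the assumption of TCP downlink tests are illustrative.

```python
# Sketch: aggregate per-UE iPerf3 JSON results into a downlink cell rate.
import json
from pathlib import Path

def ue_throughput_mbps(report_path):
    report = json.loads(Path(report_path).read_text())
    # For TCP tests, iperf3 reports received goodput under end/sum_received.
    return report["end"]["sum_received"]["bits_per_second"] / 1e6

reports = ["ue1.json", "ue2.json", "ue3.json", "ue4.json"]  # one COTS UE per file
cell_rate = sum(ue_throughput_mbps(p) for p in reports)
print(f"aggregate downlink cell rate: {cell_rate:.1f} Mbps")
```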

Link

Wireless Sensor Networks (WSNs) are pivotal in various applications, including precision agriculture, ecological surveillance, and the Internet of Things (IoT). However, energy limitations of battery-powered nodes are a critical challenge, necessitating optimization of energy efficiency for maximal network lifetime. Existing strategies like duty cycling and Wake-up Radio (WuR) technology have been employed to mitigate energy consumption and latency, but they present challenges in scenarios with sparse deployments and short communication ranges. This paper introduces and evaluates the performance of Unmanned Aerial Vehicle (UAV)-assisted mobile data collection for WuR-enabled WSNs through physical and simulated experiments. We propose two one-hop UAV-based data collection strategies: a naïve strategy, which follows a predetermined fixed path, and an adaptive strategy, which optimizes the collection route based on recorded metadata. Our evaluation includes multiple experiment categories, measuring collection reliability, collection cycle duration, successful data collection time (latency), and node awake time to infer network lifetime. Results indicate that the adaptive strategy outperforms the naïve strategy across all metrics. Furthermore, WuR-based scenarios demonstrate lower latency and considerably lower node awake time compared to duty cycle-based scenarios, leading to several orders of magnitude longer network lifetime. Remarkably, our results suggest that the use of WuR technology alone achieves unprecedented network lifetimes, regardless of whether data collection paths are optimized. This underscores the significance of WuR as the technology of choice for all energy-critical WSN applications.
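The sketch below contrasts the two collection strategies with a toy geometry: a naive pass over a fixed node order versus a greedy nearest-node pass; the coordinates and the greedy heuristic are illustrative assumptions, not the exact planner used in the paper.

```python
# Toy sketch of the two collection strategies: a naive pass that visits nodes in a
# fixed order, and an adaptive pass that greedily visits the nearest node with
# pending data. Coordinates (meters) and the heuristic are illustrative.
import math

def path_length(points):
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def naive_route(base, nodes):
    return [base] + nodes + [base]                 # predetermined fixed path

def adaptive_route(base, nodes_with_data):
    route, pos, pending = [base], base, list(nodes_with_data)
    while pending:
        nxt = min(pending, key=lambda p: math.dist(pos, p))  # nearest pending node
        route.append(nxt); pending.remove(nxt); pos = nxt
    return route + [base]

base = (0.0, 0.0)
nodes = [(60.0, 60.0), (10.0, 5.0), (55.0, 50.0), (5.0, 15.0)]
print("naive   :", round(path_length(naive_route(base, nodes)), 1), "m")
print("adaptive:", round(path_length(adaptive_route(base, nodes)), 1), "m")
```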

Link

The transition of fifth generation (5G) cellular systems to softwarized, programmable, and intelligent networks depends on successfully enabling public and private 5G deployments that are (i) fully software-driven and (ii) with performance on par with that of traditional monolithic systems. This requires hardware acceleration to scale the Physical (PHY) layer performance, end-to-end integration and testing, and careful planning of the Radio Frequency (RF) environment. In this paper, we describe how the X5G testbed at Northeastern University has addressed these challenges through the first 8-node network deployment of the NVIDIA Aerial RAN CoLab (ARC), with the Aerial Software Development Kit (SDK) for the PHY layer, accelerated on Graphics Processing Unit (GPU), and through its integration with higher layers from the OpenAirInterface (OAI) open-source project through the Small Cell Forum (SCF) Functional Application Platform Interface (FAPI). We discuss software integration, the network infrastructure, and a digital twin framework for RF planning. We then profile the performance with up to 4 Commercial Off-the-Shelf (COTS) smartphones for each base station with iPerf and video streaming applications, measuring a cell rate higher than 500 Mbps in downlink and 45 Mbps in uplink.

Link

Recent years have witnessed the Open Radio Access Network (RAN) paradigm transforming the fundamental ways cellular systems are deployed, managed, and optimized. This shift is led by concepts such as openness, softwarization, programmability, interoperability, and intelligence of the network, which have emerged in wired networks through Software-defined Networking (SDN) but lag behind in cellular systems. The realization of the Open RAN vision into practical architectures, intelligent data-driven control loops, and efficient software implementations, however, is a multifaceted challenge, which requires (i) datasets to train Artificial Intelligence (AI) and Machine Learning (ML) models; (ii) facilities to test models without disrupting production networks; (iii) continuous and automated validation of the RAN software; and (iv) significant testing and integration efforts. This paper is a tutorial on how Colosseum—the world’s largest wireless network emulator with hardware in the loop—can provide the research infrastructure and tools to fill the gap between the Open RAN vision, and the deployment and commercialization of open and programmable networks. We describe how Colosseum implements an Open RAN digital twin through a high-fidelity Radio Frequency (RF) channel emulator and end-to-end softwarized O-RAN and 5G-compliant protocol stacks, thus allowing users to reproduce and experiment upon topologies representative of real-world cellular deployments. Then, we detail the twinning infrastructure of Colosseum, as well as the automation pipelines for RF and protocol stack twinning. Finally, we showcase a broad range of Open RAN use cases implemented on Colosseum, including the real-time connection between the digital twin and real-world networks, and the development, prototyping, and testing of AI/ML solutions for Open RAN.

Link

The ever-growing number of wireless communication devices and technologies demands spectrum-sharing techniques. Effective coexistence management is crucial to avoid harmful interference, especially with critical systems like nautical and aerial radars in which incumbent radios operate mission-critical communication links. In this demo, we showcase a framework that leverages Colosseum, the world’s largest wireless network emulator with hardware-in-the-loop, as a playground to study commercial radar waveforms coexisting with a cellular network in the CBRS band in complex environments. We create an ad-hoc high-fidelity spectrum-sharing scenario for this purpose. We deploy a cellular network to collect IQ samples with the aim of training an ML agent that runs at the base station. The agent has the goal of detecting incumbent radar transmissions and vacating the cellular bandwidth to avoid interfering with the radar operations. Our experimental results show an average detection accuracy of 88%, with an average detection time of 137 ms.

Link

Because of the ever-growing number of wireless consumers, spectrum-sharing techniques have become increasingly common in the wireless ecosystem, with the main goal of avoiding harmful interference to coexisting communication systems. This is even more important when considering systems, such as nautical and aerial fleet radars, in which incumbent radios operate mission-critical communication links. To study, develop, and validate these solutions, adequate platforms, such as the Colosseum wireless network emulator, are key as they enable experimentation with spectrum-sharing heterogeneous radio technologies in controlled environments. In this work, we demonstrate how Colosseum can be used to twin commercial radio waveforms to evaluate the coexistence of such technologies in complex wireless propagation environments. To this aim, we create a high-fidelity spectrum-sharing scenario on Colosseum to evaluate the impact of twinned commercial radar waveforms on a cellular network operating in the CBRS band. Then, we leverage IQ samples collected on the testbed to train a machine learning agent that runs at the base station to detect the presence of incumbent radar transmissions and vacate the bandwidth to avoid causing them harmful interference. Our results show an average detection accuracy of 88%, with accuracy above 90% in SNR regimes above 0 dB and SINR regimes above -20 dB, and with an average detection time of 137 ms.
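As a stand-in for the sensing step (the paper trains an ML agent on IQ captures from Colosseum), the toy sketch below applies a simple energy detector to a window of synthetic IQ samples and decides whether to vacate the band; the sample rate, threshold, and chirp stand-in for the radar pulse are assumptions.

```python
# Toy sketch of the sensing-and-vacate loop: estimate received power in a window of
# IQ samples and flag incumbent radar activity against a calibrated noise floor.
# A simple energy detector on synthetic data stands in for the trained ML agent.
import numpy as np

rng = np.random.default_rng(1)
N = 4096                                   # samples per sensing window (illustrative)

def power_db(iq):
    return 10 * np.log10(np.mean(np.abs(iq) ** 2))

def radar_detected(iq, noise_floor_db=0.0, margin_db=2.0):
    return power_db(iq) > noise_floor_db + margin_db

def chirp(n, fs=1e6, rate_hz_per_s=100e6):
    t = np.arange(n) / fs                  # synthetic linear chirp as a radar stand-in
    return np.exp(2j * np.pi * 0.5 * rate_hz_per_s * t ** 2)

noise = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)  # 0 dB floor
for label, iq in [("noise only", noise), ("radar + noise", noise + chirp(N))]:
    action = "vacate band" if radar_detected(iq) else "keep transmitting"
    print(f"{label}: {action}")
```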

Link

Large-scale wireless testbeds are being increasingly used in developing and evaluating new solutions for next generation wireless networks. Among others, high-fidelity FPGA-based emulation platforms have unique capabilities for faithfully modeling real-world wireless environments in real-time and at scale, while guaranteeing repeatability. However, the reliability of the solutions tested on emulation platforms heavily depends on the precision of the emulation process, which is often overlooked. To address this unmet need in wireless network emulator-based experiments, in this paper we present CaST, a Channel emulation generator and Sounder Toolchain for creating and characterizing realistic wireless network scenarios with high accuracy. CaST consists of (i) a framework for creating mobile wireless scenarios from ray-tracing models for FPGA-based emulation platforms, and (ii) a containerized Software Defined Radio-based channel sounder to precisely characterize the emulated channels. We demonstrate the use of CaST by designing, deploying and validating multi-path mobile scenarios on Colosseum, the world's largest wireless network emulator. Results show that CaST achieves ≤ 20 ns accuracy in sounding Channel Impulse Response tap delays, and 0.5 dB accuracy in measuring tap gains.
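The core sounding idea can be illustrated with a short numerical example: correlate the received signal against the known probe sequence to recover tap delays and gains; the two-tap channel, probe length, and sample rate below are illustrative and do not reproduce CaST's implementation.

```python
# Sketch of channel sounding by cross-correlation: correlate the received signal
# against the known probe to estimate Channel Impulse Response tap delays and gains.
import numpy as np

rng = np.random.default_rng(2)
fs = 100e6                                   # 100 MS/s -> 10 ns delay resolution
probe = rng.choice([-1.0, 1.0], size=1024)   # known wideband probe sequence

# Emulated channel: two taps, 0 dB at 0 ns and -6 dB at 50 ns (5 samples).
channel = np.zeros(16); channel[0] = 1.0; channel[5] = 10 ** (-6 / 20)
rx = np.convolve(probe, channel) + 0.01 * rng.standard_normal(len(probe) + len(channel) - 1)

# Cross-correlate and keep non-negative lags; normalize by the probe energy.
cir = np.correlate(rx, probe, mode="full")[len(probe) - 1:] / np.sum(probe ** 2)
taps = np.nonzero(np.abs(cir[:16]) > 0.15)[0]
for k in taps:
    print(f"tap at {k / fs * 1e9:.0f} ns, gain {20 * np.log10(np.abs(cir[k])):.1f} dB")
```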

Link

Colosseum is an open-access and publicly-available large-scale wireless testbed for experimental research via virtualized and softwarized waveforms and protocol stacks on a fully programmable, “white-box” platform. Through 256 state-of-the-art software-defined radios and a massive channel emulator core, Colosseum can model virtually any scenario, enabling the design, development and testing of solutions at scale in a variety of deployments and channel conditions. These Colosseum radio-frequency scenarios are reproduced through high-fidelity FPGA-based emulation with finite-impulse response filters. Filters model the taps of desired wireless channels and apply them to the signals generated by the radio nodes, faithfully mimicking the conditions of real-world wireless environments. In this paper, we introduce Colosseum as a testbed that is for the first time open to the research community. We describe the architecture of Colosseum and its experimentation and emulation capabilities. We then demonstrate the effectiveness of Colosseum for experimental research at scale through exemplary use cases including prevailing wireless technologies (e.g., cellular and Wi-Fi) in spectrum sharing and unmanned aerial vehicle scenarios. A roadmap for Colosseum future updates concludes the paper.
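A minimal numerical sketch of the FIR tap idea follows: the emulated channel is a short vector of complex taps and the transmitted waveform is convolved with it; the tap values and the QPSK test signal are illustrative, not an actual Colosseum scenario.

```python
# Minimal sketch of FIR-based channel emulation: the emulated channel is a set of
# complex taps, and the transmit waveform is convolved with them before reaching
# the receiver. Tap values and the QPSK test signal are illustrative.
import numpy as np

rng = np.random.default_rng(3)

# Transmit waveform: 1000 random QPSK symbols.
symbols = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=1000) / np.sqrt(2)

# Emulated channel: direct path plus two attenuated, phase-rotated reflections.
taps = np.array([1.0 + 0.0j, 0.4 * np.exp(1j * np.pi / 3), 0.2 * np.exp(-1j * np.pi / 5)])

rx = np.convolve(symbols, taps)                      # what the receiving node observes
rx += 0.05 * (rng.standard_normal(rx.size) + 1j * rng.standard_normal(rx.size))

print("tx power:", np.mean(np.abs(symbols) ** 2).round(3))
print("rx power:", np.mean(np.abs(rx) ** 2).round(3))
```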

Link

Wireless networks are ubiquitous in our modern world, and we rely more and more on their continuous and reliable operation, even for battery-powered devices. Networks that self-maintain and self-heal are inherently more reliable. We study efficient and effective network self-healing and update methods for routing recovery following routing failures in a wireless multi-hop network. Network update processes are important since they enable local nodes to maintain up-to-date neighbor information for routing as the network changes due to failures. Network updates, however, also introduce control signaling overhead. In this paper, we investigate the trade-off between routing performance and overhead cost for different network update algorithms, and we characterize the performance of the proposed algorithms using network simulations. We show that network updates have a positive impact on routing. In particular, the on-demand route update method provides the best results among the compared techniques, with an improvement that varies depending on the network topology and failure scenario.
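As a toy illustration of the on-demand flavor of route update (not the simulated protocols from the paper), the sketch below recomputes a route only when forwarding over the current path breaks; the topology and BFS routing are illustrative.

```python
# Sketch of on-demand route update: routes are recomputed only when a broken next
# hop is detected, rather than through periodic update flooding. The small topology
# and BFS routing below are illustrative.
from collections import deque

def bfs_route(adj, src, dst):
    parent, frontier = {src: None}, deque([src])
    while frontier:
        u = frontier.popleft()
        if u == dst:
            path = [u]
            while parent[u] is not None:
                u = parent[u]; path.append(u)
            return path[::-1]
        for v in adj[u]:
            if v not in parent:
                parent[v] = u; frontier.append(v)
    return None

adj = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"], "D": ["B", "C"]}
route = bfs_route(adj, "A", "D")
print("initial route:", route)

# Link B-D fails; only when forwarding breaks does the source trigger an update.
adj["B"].remove("D"); adj["D"].remove("B")
if not all(b in adj[a] for a, b in zip(route, route[1:])):   # failure detected on demand
    route = bfs_route(adj, "A", "D")
print("recovered route:", route)
```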

Link