IEEE Global Communications Conference
9-13 December 2018 // Abu Dhabi, UAE
Gateway to a Connected World

Tutorials

Sunday, 9 December, 09:00-12:30

TUT01: Deep Learning for Communications
TUT02: Signal Processing for Big Data Analytics: Fundamentals and Applications
TUT03: Embracing Non-Orthogonal Multiple Access in Future Wireless Networks
TUT04: Integrating UAVs into Cellular: Enabling Technologies and Research Challenges
TUT05: Blockchain for Cyberphysical Systems: Applications, Opportunities and Challenges
TUT06: Fronthauling/Backhauling and Core Network Evolution Towards 5G and Beyond

Sunday, 9 December, 14:00-17:30

TUT07: Machine Learning and Artificial Intelligence in Wireless Networks: Challenges and Opportunities
TUT08: Hybrid Beamforming for 5G Millimeter Wave Systems
TUT09: Interactive Learning for Optimizing IoT Management
TUT10: Molecular Communication: Methods, Simulations, and Experiments
TUT11: Optimization and Economics of Mobile Crowd Sensing
TUT12: Towards Convergent IoT Proliferation: Standards and Leading Architectures

Thursday, 13 December, 09:00-12:30

TUT13: Signal Processing and Optimization Techniques for Ultra Reliable Low Latency Communications (URLLC)
TUT14: UAV Cellular Communications: Practical Insights and Future Vision
TUT15: Wireless Radio Access for 5G and Beyond
TUT16: Vertical-Oriented E2E Network Slicing and Orchestration: Modeling, Optimization, and Implementation
TUT17: Rate-Splitting Multiple Access for Next Generation Wireless Networks: Bridging the Extremes
TUT18: Optical Wireless Communication: Fundamental Limits, New Advances and Future Perspectives

Thursday, 13 December, 14:00-17:30

TUT19: On Network Slicing and Network Softwarisation in 5G Mobile Systems
TUT20: Short-Packet Communications: Fundamentals and Practical Coding Schemes
TUT21: Interference Management in Wireless Networks: Fundamental Bounds and the Role of Cooperation (Cancelled)
TUT22: Massive Wireless Networks in 5G and Beyond: Spatiotemporal Modeling and Design
TUT23: Connected Vehicles in the 5G Era
TUT24: Mobile Edge Computing for Internet of Things (Cancelled)


Sunday, 9 December, 09:00-12:30

TUT01: Deep Learning for Communications
Room: Capital Suite 13
Presenters: Jakob Hoydis (Nokia Bell Labs, France); Stephan ten Brink, Sebastian Cammerer and Sebastian Dörner (University of Stuttgart, Germany)

In the last decade, deep learning has led to many breakthroughs in various domains, such as computer vision, natural language processing, and speech recognition. Motivated by these successes, researchers all over the world have recently started to investigate applications of this tool to their respective domains of expertise, with communications being one of them. The goal of this tutorial is to provide an introduction to deep learning that will enable attendees to identify potential applications in their own research fields. We give an overview of the very rapidly growing body of literature, explain state-of-the-art neural network architectures and training methods, and go through several promising applications and concepts, such as neural decoding, deep MIMO detection, autoencoders, and the information bottleneck. Attendees receive tutorial slides and Jupyter notebooks containing code examples which allow them to quickly get up to speed with this new and exciting field. During the break, we demonstrate the world's first fully neural network-based communications system.
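The autoencoder view of a communications link mentioned above treats transmitter, channel, and receiver as one end-to-end system. As an illustration only (not the tutorial's own notebooks), the numpy sketch below replaces the trained encoder/decoder networks with a fixed random codebook and a nearest-neighbor rule, keeping just the end-to-end structure: encode, normalize, add noise, decode.

```python
import numpy as np

rng = np.random.default_rng(0)

M, n = 16, 7       # 16 messages, 7 channel uses
snr_db = 10.0

# "Encoder network": a fixed random codebook stands in for trained dense
# layers; each row is the transmit signal for one message, normalized to
# unit average power per symbol (the autoencoder's normalization layer).
codebook = rng.standard_normal((M, n))
codebook /= np.sqrt((codebook ** 2).mean(axis=1, keepdims=True))

def channel(x, snr_db):
    """AWGN layer of the end-to-end chain."""
    sigma = np.sqrt(10 ** (-snr_db / 10))
    return x + sigma * rng.standard_normal(x.shape)

def decode(y):
    """Receiver: nearest codeword, i.e. the decision a trained decoder
    network would approximate."""
    d = ((y[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

msgs = rng.integers(0, M, size=10000)
y = channel(codebook[msgs], snr_db)
bler = (decode(y) != msgs).mean()
print(f"block error rate at {snr_db} dB: {bler:.4f}")
```

A trained autoencoder learns the codebook and the decision rule jointly; this sketch only fixes the structure they are trained within.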

TUT02: Signal Processing for Big Data Analytics: Fundamentals and Applications
Room: Capital Suite 17
Presenters: Lingyang Song (Peking University, P.R. China); Zhu Han (University of Houston, USA)

Today, scientists, engineers, educators, citizens and decision-makers have unprecedented amounts and types of data available to them. The phrase "Big Data" refers to data that challenge existing analytical methods due to their size, complexity, or rate of availability, and managing and analyzing such data can require fundamentally new techniques and technologies. At the same time, the advent of big data offers unprecedented opportunities for data-driven discovery and decision-making in virtually every area of human endeavor. Meanwhile, mobile data traffic, especially mobile video traffic, has dramatically increased in recent years with the emergence of smartphones, tablets, and various new applications. It is hence crucial to rethink the design and optimization of wireless networks in order to accommodate these bandwidth-consuming applications and services with the help of data analytics and machine learning techniques. Realizing the transformative potential of big data requires addressing many challenges in the management of data and knowledge, computational methods for data analysis, and the automation of many aspects of data-enabled discovery processes. Combinations of computational, mathematical, and statistical techniques, methodologies and theories are needed to enable these advances. The signal processing and systems engineering communities can be important contributors to Big Data research and development, complementing computer and information science-based efforts in this direction. Big Data analytics entails high-dimensional, decentralized, online, and robust statistical signal processing, as well as large, distributed, fault-tolerant, and intelligent systems engineering. There is thus a need and an opportunity for the signals and systems communities to jointly pursue Big Data research and development.
The aim of this tutorial is to bring together signal processing engineers, computer and information scientists, applied mathematicians and statisticians, as well as systems engineers to carve out the role that analytical and experimental engineering has to play in Big Data research and development. The tutorial will emphasize signal analytics, networking, computation, optimization, and systems engineering aspects of Big Data. There are four main objectives. The first is to provide an introduction to the big data paradigm from the signal processing perspective. The second is to introduce, in a comprehensive way, the key techniques that enable signal processing for big data. The third is to provide numerical datasets and illustrate how signal processing approaches can be applied to wireless datasets. The fourth is to present state-of-the-art big data applications, including a classification of the different schemes and the technical details of each.

TUT03: Embracing Non-Orthogonal Multiple Access in Future Wireless Networks
Room: Capital Suite 16
Presenters: Zhiguo Ding (Lancaster University, United Kingdom (Great Britain))

Non-orthogonal multiple access (NOMA) has been recognized as a paradigm shift in the design of multiple access techniques for future wireless networks. The essential idea of NOMA is to encourage radio resource sharing: multiple users are served on the same frequency resource blocks at the same time, instead of a single user solely occupying one block at a time as in orthogonal multiple access (OMA). As such, NOMA offers better flexibility for utilizing scarce bandwidth resources and meeting the heterogeneous demands for low latency, high reliability, massive connectivity, improved fairness, and high throughput in future wireless networks. Because of its superior performance, NOMA has attracted a lot of attention from both industry and academia, and the principle of NOMA has been included in or proposed for 3GPP LTE-Advanced, the forthcoming 5G New Radio, and the next-generation digital TV standard. This tutorial will provide a comprehensive treatment of the principles, architectures, and applications of NOMA techniques in future wireless networks. Future research challenges regarding NOMA are also presented.
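The rate comparison behind NOMA's flexibility can be made concrete in a stylized two-user downlink example with superposition coding and successive interference cancellation (SIC); all numbers below are illustrative, not from the tutorial.

```python
import math

# Stylized two-user downlink: a strong (near) and a weak (far) user share
# one resource block. Channel gains and power split are illustrative.
snr = 100.0                  # transmit SNR P/sigma^2 (20 dB)
g_near, g_far = 1.0, 0.01    # channel gains |h|^2
a_near, a_far = 0.2, 0.8     # NOMA power split (more power to the weak user)

# NOMA: the far user treats the near user's signal as noise; the near
# user performs SIC (decodes and removes the far user's signal first).
r_far_noma = math.log2(1 + a_far * snr * g_far / (a_near * snr * g_far + 1))
r_near_noma = math.log2(1 + a_near * snr * g_near)

# OMA baseline: each user gets the full power in half the time.
r_near_oma = 0.5 * math.log2(1 + snr * g_near)
r_far_oma = 0.5 * math.log2(1 + snr * g_far)

print(f"NOMA rates (bit/s/Hz): near={r_near_noma:.2f}, far={r_far_noma:.2f}")
print(f"OMA  rates (bit/s/Hz): near={r_near_oma:.2f}, far={r_far_oma:.2f}")
```

With these (disparate) channel gains, the NOMA sum rate exceeds the equal-time OMA sum rate, which is the flexibility gain the abstract refers to.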

TUT04: Integrating UAVs into Cellular: Enabling Technologies and Research Challenges
Room: Capital Suite 14
Presenters: Rui Zhang (National University of Singapore, Singapore) and Yong Zeng (The University of Sydney, Australia)

The integration of unmanned aerial vehicles (UAVs) into cellular systems calls for a paradigm shift in the design of conventional cellular systems, to enable a highly heterogeneous network architecture with not only terrestrial users and base stations (BSs), but also aerial users and communication platforms at altitudes varying from a few meters to dozens of kilometers. In particular, both frameworks of cellular-connected UAV communications and UAV-assisted wireless communications differ significantly from conventional terrestrial communications, due to the high altitude and mobility of UAVs, the unique channel characteristics of UAV-ground links, the asymmetric quality of service (QoS) requirements for downlink control and uplink data transmissions, the stringent constraints imposed by the size, weight, and power (SWAP) limitations of UAVs, as well as the additional design degrees of freedom from joint UAV mobility control and communication resource allocation. Significant research efforts from both academia and industry have been devoted to exploring this exciting new field, with remarkable progress made, especially in the past couple of years. The aim of this tutorial is thus to provide a comprehensive overview of the potential applications, networking architectures, latest research findings, and key enabling technologies for integrating UAVs into cellular systems. Major challenges and promising directions for future research will also be highlighted.

TUT05: Blockchain for Cyberphysical Systems: Applications, Opportunities and Challenges
Room: Capital Suite 15
Presenters: Salil S Kanhere (The University of New South Wales, Australia); Raja Jurdak (CSIRO & University of Queensland, Australia)

In a cyber-physical system (CPS), computing elements coordinate and communicate with sensors, which monitor cyber and physical indicators, and actuators, which modify the cyber and physical environment in which they run. Current CPS ecosystems rely on centralised, brokered communication models, otherwise known as the client-server paradigm. All devices are identified, authenticated and connected through cloud servers, and the data collected by the devices is stored in the cloud for further processing. While this model has connected generic computing devices for decades and will continue to support the small-scale CPS networks we see today, it will not be able to respond to the growing needs of the large-scale CPS ecosystems of tomorrow, with billions of connected devices. Cloud servers will remain a bottleneck and a point of failure that can disrupt the entire network. This is especially important as critical services and infrastructure such as healthcare, electric grids, logistics, and transportation become dependent on CPS. The current stove-piped architecture has also created isolated data silos, where users have limited control over their data and how it is used. In this tutorial, we will explore how Blockchain (BC) technology has the potential to overcome these challenges. A BC is an immutable, timestamped ledger of blocks that is used for storing and sharing data in a distributed manner. In recent years, BC has attracted tremendous attention from practitioners and academics in different disciplines (including law, finance, and computer science) due to its salient features, which include its distributed structure, immutability, security, and privacy. The tutorial will specifically consider three key aspects of CPS: (i) the Internet of Things; (ii) intelligent transportation; and (iii) supply chains. We will explain relevant concepts, review the state-of-the-art, present representative solutions that have been proposed, and discuss open challenges.
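The ledger structure described above, hash-chained timestamped blocks whose links break if any past entry is edited, can be sketched minimally as follows. This is an illustration of the data structure only, not a distributed or production blockchain.

```python
import hashlib
import json
import time

def block_hash(block):
    """Hash the block's contents (excluding its own hash field)."""
    payload = {k: v for k, v in block.items() if k != "hash"}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def append_block(chain, data):
    """Append a timestamped block linked to the previous block's hash."""
    block = {
        "index": len(chain),
        "timestamp": time.time(),
        "data": data,
        "prev_hash": chain[-1]["hash"] if chain else "0" * 64,
    }
    block["hash"] = block_hash(block)
    chain.append(block)

def verify(chain):
    """Immutability check: any tampered block breaks the hash links."""
    for i, b in enumerate(chain):
        if b["hash"] != block_hash(b):
            return False
        if i > 0 and b["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
append_block(chain, {"sensor": "temp-01", "reading": 21.5})
append_block(chain, {"sensor": "temp-01", "reading": 22.1})
assert verify(chain)

chain[0]["data"]["reading"] = 99.9   # an attacker edits a past reading
print("valid after tampering?", verify(chain))
```

Real blockchains add consensus (who may append) and replication (who holds a copy) on top of this hash-chain core.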

TUT06: Fronthauling/Backhauling and Core Network Evolution Towards 5G and Beyond
Room: Capital Suite 18
Presenters: George K. Karagiannidis (Aristotle University of Thessaloniki, Greece); Koralia N. Pappi (Aristotle University of Thessaloniki & Intracom S.A. Telecom Solutions, Greece); Panagiotis D. Diamantoulakis (Aristotle University of Thessaloniki, Greece)

The need for increasing data capacity and coverage, together with the 5G vision of improved end-to-end performance, low latency, and low energy consumption, has altered the traditional architecture of mobile networks, leading to a new era of cloud radio, network function virtualization, and central management of a common pool of resources. The benefits brought by research advances in radio access technology cannot be utilized in full without state-of-the-art backhauling techniques and a radical redesign of the packet core network. This tutorial will summarize: i) the cloud architecture of the radio access network (RAN), ii) the most prominent communication techniques for fronthauling and backhauling and their appropriateness according to the network's needs, iii) the capabilities, similarities, and differences of Long-Term Evolution (LTE) and 5G core networks, and iv) various end-to-end use cases, depending on the application scenario. This tutorial is intended for a diverse audience, including researchers working on backhauling/fronthauling techniques, cloud radio access networks, and next-generation mobile end-to-end applications; industry peers, such as telecom vendors, operators, and enterprises; industries interested in deploying 5G and IoT services; and telecom engineering students.

Sunday, 9 December, 14:00-17:30

TUT07: Machine Learning and Artificial Intelligence in Wireless Networks: Challenges and Opportunities
Room: Capital Suite 13
Presenters: Walid Saad (Virginia Tech, USA); Mehdi Bennis (Centre of Wireless Communications, University of Oulu, Finland)

Next-generation wireless networks must support ultra-reliable, low-latency communication and intelligently manage a massive number of Internet of Things (IoT) devices in real time, within a highly dynamic environment. These stringent quality-of-service (QoS) requirements, as well as mobile edge and core intelligence, can only be realized by integrating fundamental notions of artificial intelligence (AI) and machine learning across the wireless infrastructure and end-user devices. To this end, the goal of this tutorial is to provide a holistic treatment of machine learning for wireless network design. In particular, we first provide a comprehensive treatment of the fundamentals of machine learning and artificial neural networks, which are one of the most important pillars of AI. Then, we introduce a classification of the various types of neural networks, including feed-forward neural networks, recurrent neural networks, spiking neural networks, and deep neural networks. For each type, we provide an introduction to its basic components and training process, along with a specific example neural network. Then, we overview a broad range of wireless applications that can make use of neural network designs, including spectrum management, multiple radio access technology cellular networks, wireless virtual reality, mobile edge computing and caching, drone-based communications, the Internet of Things, and vehicular networks. For each application, we first outline the main rationale for applying machine learning while pinpointing illustrative scenarios and challenges. We complement this overview with a detailed example drawn from the state-of-the-art. Finally, we conclude by shedding light on future research directions.
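As a toy illustration of the "basic components and training process" of a feed-forward network, the sketch below trains a one-hidden-layer network on XOR with hand-coded backpropagation; the architecture and hyperparameters are illustrative choices, not from the tutorial.

```python
import numpy as np

rng = np.random.default_rng(1)

# XOR: the classic task a single-layer network cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of tanh units, sigmoid output, cross-entropy loss.
W1, b1 = rng.standard_normal((2, 16)), np.zeros(16)
W2, b2 = rng.standard_normal((16, 1)), np.zeros(1)
lr = 0.3

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):
    # forward pass
    h = np.tanh(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass (chain rule by hand); out - y is the gradient of the
    # binary cross-entropy loss w.r.t. the pre-sigmoid output
    d_out = out - y
    d_h = (d_out @ W2.T) * (1 - h ** 2)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

pred = (out > 0.5).astype(int).ravel()
print("predictions:", pred)
```

Recurrent, spiking, and deep networks covered in the tutorial differ in architecture and training details, but the forward/backward structure above is the common starting point.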

TUT08: Hybrid Beamforming for 5G Millimeter Wave Systems
Room: Capital Suite 15
Presenters: Jun Zhang (The Hong Kong University of Science and Technology, Hong Kong)

The upcoming 5G network needs to achieve substantially larger link capacity and ultra-low latency to support emerging mobile applications. While conventional techniques have reached their limits, uplifting the carrier frequency to the millimeter wave (mm-wave) band stands out as an effective approach to further boost the network capacity, as it provides orders of magnitude more spectrum than current cellular bands. Thus mm-wave communication is becoming synonymous with 5G. Large-scale antenna arrays are needed to fully exploit the performance gains of mm-wave communications, which, however, brings formidable challenges to algorithm design and hardware implementation. Conventional fully digital beamforming techniques are inapplicable, as they demand a separate radio frequency (RF) chain for each antenna element. Hybrid beamforming has recently been proposed as a cost-effective alternative: it requires only a small number of RF chains, and thus can significantly reduce hardware cost and power consumption. Nevertheless, the successful implementation of hybrid beamforming faces a few key challenges. Due to the unit-modulus constraint on the analog component, hybrid beamforming problems are inherently nonconvex and may incur unaffordable computational complexity. Furthermore, it is critically important to further reduce the hardware complexity of hybrid beamforming structures, as mm-wave components are costly and power hungry. Meanwhile, the reduced hardware complexity may introduce substantial performance degradation compared with fully digital beamforming. Thus the design of hybrid beamforming differs fundamentally from that of fully digital beamforming. This tutorial will present recent developments in this active area, including effective hardware structures and beamforming algorithms.
A holistic approach will be taken, emphasizing three decisive aspects: 1) hardware efficiency (HE), i.e., the required hardware components; 2) the computational efficiency (CE) of the associated beamforming algorithm; and 3) the achievable spectral efficiency (SE). Through systematic comparison, the interplay and tradeoffs among the three design aspects will be demonstrated, and promising candidates for hybrid beamforming in 5G mm-wave systems will be identified.
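The hardware-efficiency versus spectral-efficiency tradeoff can be sketched in the simplest setting: a single-user link where a phase-only (unit-modulus) analog beamformer with one RF chain is compared against fully digital matched-filter beamforming with one RF chain per antenna. The i.i.d. channel model and numbers below are illustrative assumptions, not from the tutorial.

```python
import numpy as np

rng = np.random.default_rng(2)

N = 64        # transmit antennas
snr = 10.0    # transmit SNR

# Random i.i.d. Rayleigh channel as a stand-in; actual mm-wave channels
# are sparser, which brings phase-only beamforming even closer to optimal.
h = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)

# Fully digital beamforming: matched filter (MRT), one RF chain per antenna.
f_digital = h.conj() / np.linalg.norm(h)

# Analog beamforming: unit-modulus phase shifters only (single RF chain),
# co-phasing each antenna but unable to weight amplitudes.
f_analog = np.exp(-1j * np.angle(h)) / np.sqrt(N)

se = lambda f: np.log2(1 + snr * np.abs(h @ f) ** 2)
print(f"fully digital MRT : {se(f_digital):.2f} bit/s/Hz")
print(f"phase-only analog : {se(f_analog):.2f} bit/s/Hz")
```

The phase-only beamformer loses only the amplitude-weighting gain, which illustrates why hybrid structures with a few RF chains can approach fully digital performance at a fraction of the hardware cost.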

TUT09: Interactive Learning for Optimizing IoT Management
Room: Capital Suite 14
Presenters: Georgios B. Giannakis and Tianyi Chen (University of Minnesota, USA)

The Internet of Things (IoT) envisions an intelligent infrastructure of networked smart devices offering task-specific services. IoT features include extreme heterogeneity, ubiquitous devices, and unpredictable dynamics, in part due to human participation. In this IoT context, the need arises for foundational innovations in network design and management that allow efficient adaptation to changing environments and low-cost service provisioning, subject to stringent latency constraints. To this end, the overarching goal of this tutorial is to outline a unifying framework encompassing improved online learning and management policies through contemporary communication, networking, and optimization techniques. From the network design vantage point, the approaches leverage a promising architecture termed fog, which enables smart devices to gain proximal access to cloud functionalities at the network edge, along the cloud-to-things continuum. From the management perspective, key innovations will target statistical learning-aided online network management approaches and their scalable implementation under limited feedback. To further accommodate dynamics, the approaches rely on reinforcement learning empowered with kernel-based function approximators to infer IoT user preferences as well as allocate popular content in distributed storage assets. The impact of the unified framework will be demonstrated through tests featuring a much-needed reduction in service delay and operation robust to non-stationary environments.

TUT10: Molecular Communication: Methods, Simulations, and Experiments
Room: Capital Suite 16
Presenters: Nariman Farsad (Stanford University, USA); Yansha Deng (King's College London, United Kingdom (Great Britain)); Adam Noel (University of Warwick, United Kingdom (Great Britain)); Andrew Eckford (York University, Canada)

This tutorial introduces the emerging field of molecular communication, wherein chemical signals are used to connect "tiny" machines such as living cells, synthetic biological devices, and swarms of micro-scale robots. The tutorial begins with an overview of molecular communication systems and how they are modeled; each comprises a transmitter, a propagation channel, and a receiver, just as in a conventional communication system. Specific channel and noise models are presented, and the derivation of channel impulse responses is discussed. Signal processing via chemical and genetic circuits is described. An overview and demonstration of specialized simulation tools is provided. The tutorial concludes with a discussion of recent experimental implementations of molecular communication and some of the most important open problems in this exciting new area.
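One channel impulse response commonly derived in this setting is that of free diffusion in 3-D: after an instantaneous release of molecules, the concentration at distance r is c(r, t) = N / (4*pi*D*t)^(3/2) * exp(-r^2 / (4*D*t)), which peaks at t = r^2/(6D). A short numerical check (parameter values are illustrative):

```python
import numpy as np

# Impulse response of a 3-D free-diffusion channel: concentration at
# distance r and time t after an instantaneous release of N molecules.
D = 1e-10    # diffusion coefficient (m^2/s), order of small molecules in water
r = 1e-5     # transmitter-receiver distance: 10 micrometers
N = 1e4      # molecules released

t = np.linspace(1e-3, 2.0, 20000)                         # seconds
c = N / (4 * np.pi * D * t) ** 1.5 * np.exp(-r ** 2 / (4 * D * t))

t_peak = t[c.argmax()]
print(f"numerical peak at t  = {t_peak:.3f} s")
print(f"analytical r^2/(6D) = {r ** 2 / (6 * D):.3f} s")
```

The long tail of this response after the peak is what causes the severe inter-symbol interference that molecular receiver designs must contend with.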

TUT11: Optimization and Economics of Mobile Crowd Sensing
Room: Capital Suite 18
Presenters: Lin Gao (Harbin Institute of Technology (Shenzhen), P.R. China); Man Hon Cheung (The Chinese University of Hong Kong, Hong Kong); Fen Hou (University of Macau, Macao); Jianwei Huang (The Chinese University of Hong Kong, Hong Kong)

Mobile Crowd Sensing (MCS) is a novel and promising sensing paradigm. It can achieve flexible and scalable sensing coverage with a low deployment cost by encouraging mobile users to participate and contribute their smartphones as sensors. In this tutorial, we will discuss the architecture of MCS and the associated key optimization and economics issues. Motivated by several recent academic efforts and business practices, we will discuss the following issues: (i) the participating and reporting behaviors of mobile users; (ii) the impacts of user reputation, social relationships, and diversity on MCS; (iii) task similarity and the corresponding data reuse among tasks; and (iv) a decentralized P2P architecture for scalable MCS. The objective of this tutorial is to provide the audience with a comprehensive understanding of current optimization and business modeling techniques for MCS.

TUT12: Towards Convergent IoT Proliferation: Standards and Leading Architectures
Room: Capital Suite 17
Presenters: Sharief M.A. Oteafy (DePaul University, USA); Hossam S. Hassanein (Queen's University, Canada)

Recent developments in the Internet of Things (IoT) are increasingly insular, pushing the envelope in a myriad of silos. While the research community has produced significant milestones in improving the energy footprint, processing capacity, and overall resilience of IoT systems, today's practitioner faces significant challenges in adopting an IoT platform/framework/standard. As a technology that witnessed long strides in development before a definition was even agreed upon, IoT systems largely suffer from a lack of interoperability and contradictory operational mandates. These challenges are magnified as each of these systems often generates data with significant volume, variety, veracity and, most importantly, uncalibrated value. In this tutorial we will elaborate on the chronological and topological evolution of IoT frameworks, targeting a common understanding of the underlying reference models. We will present and contrast leading standards from industry (e.g., SymphonyLink, Thread, LoRaWAN), academia, and research communities (e.g., IEEE P2413 and ETSI), as well as growing alliances that span multiple stakeholders (e.g., AllSeen and the Open Interconnect Consortium). We will also present leading attempts at generic reference models that abstract the different layers of operation/perception in current IoT stacks. We then present the ensuing challenge of Big Sensed Data (BSD) and the critical challenges facing IoT proliferation. That is, since collected data is often mainly insightful to the network that collected it, any "sense-making" process built upon heterogeneously collected data faces significant interoperability problems, exposing challenges with varying quality, inconsistent data labelling, inaccuracies, time sensitivities, and differing reporting granularities.
That is, sensing systems inherently adopt a collect-and-report model, whereby collected data is indiscriminately pushed onto the networking infrastructure, regardless of its Quality of Information (QoI) or Value of Information (VoI). Thus, many attempts to converge IoT operation by synergizing data repositories merely shift the problem into the field of data science rather than address the discrepancy in operation at the networking level. We argue that sustainable proliferation in IoT operation is coupled with convergence in operational mandates, transcending heterogeneity in resources. Real-time decision making is inherently built on the efficacy of ubiquitous sensing systems, not on the aggregation of devices that are isolated in operation and management. At a time when important IoT applications such as real-time road monitoring, health informatics, and emergency services require rapid and scalable access to contextual information about patients, mobile crowds, and the general public, the status quo falls significantly short.

Thursday, 13 December, 09:00-12:30

TUT13: Signal Processing and Optimization Techniques for Ultra Reliable Low Latency Communications (URLLC)
Room: Capital Suite 13
Presenters: Eduard Jorswieck (TU Dresden, Germany); Muhammad Ali Imran (University of Glasgow, United Kingdom (Great Britain)); M. Majid Butt (Nokia Bell Labs, France)

5G is expected to support services based on Ultra Reliable Low Latency Communications (URLLC), such as industrial control, remote surgery, and the tactile internet. These are also the most challenging services to implement, because they require a new network design and control methodology in order to satisfy their requirements and enable their co-existence with the other types of services that 5G and beyond systems need to deliver. Indeed, as we enter Phase 3 of 5G design, it is imperative not only to understand how to deliver URLLC services but also to ensure that they will be offered in a sustainable fashion (i.e., without draining all network resources) that is compatible with the already provided enhanced Mobile Broadband (eMBB) services. This tutorial addresses the signal processing and optimization aspects of URLLC for 5G and beyond networks. The tutorial will cover a novel system design framework and state-of-the-art signal processing and optimization techniques, and introduce a cross-disciplinary methodology for discussing complex trade-offs in 5G and beyond networks in view of URLLC.
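A central analytical tool in this area is the finite-blocklength (normal) approximation of the maximal coding rate, which quantifies the rate penalty paid for short blocks and very low error probabilities. The AWGN sketch below is illustrative; the inclusion of the O(log n / n) correction term follows common practice.

```python
import math
from statistics import NormalDist

def urllc_rate(snr, n, eps):
    """Normal approximation of the maximal coding rate over an AWGN
    channel at blocklength n and block error probability eps."""
    C = math.log2(1 + snr)                                 # Shannon capacity
    V = (1 - 1 / (1 + snr) ** 2) * math.log2(math.e) ** 2  # channel dispersion
    q_inv = NormalDist().inv_cdf(1 - eps)                  # Q^{-1}(eps)
    return C - math.sqrt(V / n) * q_inv + 0.5 * math.log2(n) / n

snr = 10 ** (10 / 10)   # 10 dB
for n in (100, 1000, 100000):
    print(f"n={n:>6}, eps=1e-5: rate ~ {urllc_rate(snr, n, 1e-5):.3f} bit/ch.use")
print(f"capacity: {math.log2(1 + snr):.3f} bit/ch.use")
```

The gap to capacity at n = 100 is what makes the joint latency (short blocks) and reliability (small eps) targets of URLLC a genuine design problem rather than a parameter choice.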

TUT14: UAV Cellular Communications: Practical Insights and Future Vision
Room: Capital Suite 16
Presenters: Giovanni Geraci and Adrian Garcia-Rodriguez (Nokia Bell Labs, Ireland); Mahbub Hassan (University of New South Wales, Australia); Ming Ding (Data 61, Australia)

The rapid growth of consumer Unmanned Aerial Vehicles (UAVs) is creating promising new business opportunities for cellular operators. On the one hand, UAVs can be connected to cellular networks as new types of user equipment, generating significant revenue for the operators. On the other hand, UAVs offer unprecedented opportunities to realize UAV-mounted flying small cells that can dynamically reposition themselves to boost coverage, spectral efficiency, and user quality of experience. Indeed, 3GPP is currently exploring possibilities for serving commercial UAVs with cellular networks. Industry is beginning to trial early prototypes of flying base stations, while academia is in full swing researching mathematical and algorithmic solutions to many interesting new problems arising from flying nodes in cellular networks. In this tutorial, we provide a comprehensive discussion of all of these developments promoting the smooth integration of UAVs into cellular networks. Specifically, we survey and classify the types of consumer UAVs currently available off-the-shelf, the interference issues and potential solutions identified by 3GPP for serving aerial users with existing LTE networks, the challenges and opportunities for assisting cellular communications with UAV-based flying base stations, the new channel models required for accurate performance evaluation of UAV communications, the cyber-physical security of UAV-assisted cellular communication, and the new regulations being developed to manage the commercial use of UAVs. Finally, we discuss future research directions for UAV communications in 5G and beyond.

TUT15: Wireless Radio Access for 5G and Beyond
Room: Capital Suite 14
Presenters: Huseyin Arslan (University of South Florida, USA)

Today's wireless services and systems have come a long way since the rollout of conventional voice-centric cellular systems. The demand for wireless access in voice and multimedia applications has increased tremendously. In addition, new application classes like extreme mobile broadband communication, ultra-reliable and low-latency communications, massive machine-type communications, and the Internet of Things have gained significant interest recently for 5G. The trend in the variety and number of mobile devices and applications will certainly continue beyond 5G, creating a wide range of technical challenges such as cost, power efficiency, spectrum efficiency, extreme reliability, low latency, robustness against diverse channel conditions, cooperative networking capability and coexistence, and dynamic and flexible utilization of the wireless spectrum. In order to address these technical challenges, 5G waveforms and radio access technologies (RATs) should be much more flexible. Current 4G systems rely on the orthogonal frequency-division multiplexing (OFDM) waveform, which is not capable of supporting the diverse applications that 5G and beyond will offer, because the traffic generated by 5G and beyond is expected to have radically different characteristics and requirements compared to current wireless technology. For 5G to succeed, numerous waveform alternatives have been explored to best meet its various technical requirements. However, none of the alternatives were able to address all the requirements at the same time. During the standardization of 5G, one thing has become certain: there is no single enabling technology that can achieve all of the applications being promised by 5G networking. This will be even more pronounced beyond 5G. For this purpose, the concept of using multiple OFDM numerologies, i.e., different parameterizations of OFDM-based subframes, within the same frame has been proposed in 3GPP discussions for 5G.
This concept will likely meet current multi-service requirements to some extent. However, since the quantity of wireless devices and applications, and the heterogeneity of user requirements, will keep increasing over the coming decade(s), whether this level of flexibility will suffice remains questionable. Therefore, novel RATs facilitating much more flexibility are needed to address the aforementioned technical problems. In this tutorial, we will discuss potential directions for achieving further flexibility in RATs beyond 5G. In this context, a framework for developing flexible waveform, numerology, and frame design strategies will be discussed, along with sample methods in this direction. We will also discuss their potential role in handling various issues in the upper system layers.
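The multiple-numerology concept above scales the 15 kHz subcarrier spacing by powers of two, shrinking symbol and slot durations accordingly. A quick tabulation (assuming the 14-symbol slot structure used in 3GPP NR):

```python
# 5G NR scalable numerology: subcarrier spacing is 15 * 2^mu kHz; the
# useful OFDM symbol shrinks proportionally, trading latency against
# cyclic-prefix overhead and Doppler robustness.
for mu in range(5):
    scs_khz = 15 * 2 ** mu
    symbol_us = 1e3 / scs_khz    # useful symbol duration, microseconds
    slot_ms = 1.0 / 2 ** mu      # duration of a 14-symbol slot
    print(f"mu={mu}: SCS={scs_khz:>3} kHz, symbol={symbol_us:6.2f} us, "
          f"slot={slot_ms:.4f} ms")
```

Mixing these numerologies within one frame is precisely what lets a latency-critical service use short slots while a wide-area service keeps long, Doppler- and delay-spread-tolerant symbols.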

TUT16: Vertical-Oriented E2E Network Slicing and Orchestration: Modeling, Optimization, and Implementation
Room: Capital Suite 17
Presenters: Vincenzo Sciancalepore (NEC Laboratories Europe GmbH, Germany); Marco Di Renzo (Paris-Saclay University / CNRS, France)

This tutorial provides the audience with a complete survey of the potential benefits, research challenges, implementation efforts, and applications of technologies and protocols for achieving end-to-end network slicing orchestration, as well as the mathematical tools for their modeling, analysis, and optimization. The tutorial is unusual in that it tackles both system-level modeling and optimization aspects, which are usually treated independently. It also presents the vertical-oriented requirements for the advanced network services expected in 5G. Finally, we show the viewpoint of the main 5G-PPP projects, highlighting their results and open challenges.

TUT17: Rate-Splitting Multiple Access for Next Generation Wireless Networks: Bridging the Extremes
Room:
Capital Suite 15
Presenters: Bruno Clerckx (Imperial College London, United Kingdom (Great Britain))

Numerous techniques have been developed in the last decade for MIMO wireless networks, including among others MU-MIMO, CoMP, Massive MIMO, NOMA, and millimetre-wave MIMO. All of these techniques rely on one of two extreme interference-management strategies, namely fully decoding interference or treating interference as noise. Indeed, while NOMA, based on superposition coding with successive interference cancellation, relies on strong users fully decoding and cancelling the interference created by weaker users, MU-MIMO/Massive MIMO/CoMP/millimetre-wave MIMO based on linear precoding fully treat any multi-user interference as noise. In this tutorial, we depart from these two extremes and introduce the audience to a more general and more powerful transmission framework based on Rate-Splitting (RS), which consists in decoding part of the interference and treating the remaining part as noise. This enables RS to softly bridge, and therefore reconcile, the two extreme strategies of fully decoding interference and treating interference as noise. RS relies on the transmission of common (degraded) messages decoded by multiple users, and private (non-degraded) messages decoded only by their corresponding users. As a result, RS pushes multi-user transmission away from conventional unicast-only transmission towards superimposed unicast-multicast transmission and leads to a more general class/framework of strategies, with NOMA and SDMA with linear precoding as special cases of RS. RS will be shown to provide significant benefits in terms of spectral efficiency, reliability, and CSI feedback overhead reduction over the conventional strategies used or envisioned in LTE-A/5G. The gains of RS will be demonstrated in a wide range of scenarios: multi-user MIMO, massive MIMO, multi-cell MIMO/CoMP, overloaded systems, NOMA, multigroup multicasting, mmWave communications, and communications in the presence of RF impairments. Open problems and challenges will also be discussed.
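To make the common/private decomposition concrete, here is a toy numerical sketch (not from the tutorial) of 1-layer rate splitting on a single-antenna two-user Gaussian downlink with unit noise variance; the power-split parameters `alpha_c` and `split` are illustrative choices.

```python
import math

def rs_sum_rate(g1, g2, P, alpha_c, split):
    """Toy 1-layer rate-splitting sum rate for a 2-user Gaussian downlink.

    g1, g2  : channel power gains (unit noise variance assumed)
    P       : total transmit power
    alpha_c : fraction of P given to the common stream
    split   : fraction of the remaining power given to user 1's private stream
    """
    Pc = alpha_c * P
    P1 = (1 - alpha_c) * P * split
    P2 = (1 - alpha_c) * P * (1 - split)
    # The common stream must be decodable by BOTH users, with both private
    # streams still treated as noise at this stage.
    Rc = min(math.log2(1 + g * Pc / (g * (P1 + P2) + 1)) for g in (g1, g2))
    # After cancelling the common stream, each user decodes its own private
    # stream while treating the other user's private stream as noise.
    R1 = math.log2(1 + g1 * P1 / (g1 * P2 + 1))
    R2 = math.log2(1 + g2 * P2 / (g2 * P1 + 1))
    return Rc + R1 + R2

g1, g2, P = 1.0, 0.3, 10.0
tin = rs_sum_rate(g1, g2, P, alpha_c=0.0, split=0.5)  # extreme: treat interference as noise
rs = rs_sum_rate(g1, g2, P, alpha_c=0.6, split=0.9)   # a generic rate-splitting point
print(f"TIN sum rate: {tin:.2f} b/s/Hz, RS sum rate: {rs:.2f} b/s/Hz")
```

Setting `alpha_c = 0` recovers treating all interference as noise, while loading the common stream with one user's entire message mimics NOMA-style full decoding, which is the bridging behaviour described above.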

TUT18: Optical Wireless Communication: Fundamental Limits, New Advances and Future Perspectives
Room:
Capital Suite 18
Presenters: Anas Chaaban (University of British Columbia, Canada); Zouheir Rezki (University of Idaho, USA); Mohamed-Slim Alouini (King Abdullah University of Science and Technology (KAUST), Saudi Arabia)

Optical wireless communication (OWC) has recently gained a lot of interest among industrial and academic communities. The main driver of this resurgence of interest is that the radio-frequency (RF) spectrum is becoming too crowded to handle the increasingly high demand, so exploring higher-frequency spectrum, including the optical range, would provide relief. Another reason behind this interest is the relatively simple deployment of OWC systems. However, before OWC systems can be deployed in practice, there is a persistent need to establish their fundamental limits and extract design guidelines for building efficient OWC systems. Indeed, owing to different propagation channels and different transmit constraints, RF communication and OWC are fundamentally quite different. For instance, the popular intensity-modulation direct-detection (IM/DD) scheme, favoured in OWC for its simplicity, differs subtly from RF channels in the nonnegativity of the transmit signal, in addition to constraints on the average and peak of the signal. These in turn make the capacity and the optimal transmission schemes for IM/DD OWC channels different from those for RF channels. The goal of this half-day tutorial is to approach the OWC channel from an information-theoretic perspective to highlight the fundamental differences with RF. The tutorial will introduce and discuss the most recent information-theoretic results related to OWC, including single-user channels, multi-user channels (broadcast and multiple-access channels), and multi-aperture channels (parallel and MIMO channels), with and without secrecy constraints. This will acquaint researchers with results that can be very useful for better analysis and understanding of OWC in the future.

Thursday, 13 December, 14:00-17:30

TUT19: On Network Slicing and Network Softwarisation in 5G Mobile Systems
Room:
Capital Suite 18
Presenters: Tarik Taleb (Aalto University, Finland)

This tutorial will shed light on NFV, SDN, and network softwarisation, an important vision towards the realization of elastic and flexible 5G mobile systems. The tutorial will commence with a brief introduction to the major 3GPP wireless technologies, namely GSM, GPRS, UMTS, and LTE, comparing the relevant architectures and their evolution into today's Evolved Packet System (EPS). After a short discussion of the basic principles of LTE, the tutorial presents the major architectural enhancements already standardized within 3GPP for supporting EPS. The tutorial will subsequently lay emphasis on the business, functional, and technical requirements of 5G mobile systems and discuss the relevant opportunities, challenges, and expectations. It will afterwards touch upon cloud computing technologies, virtualization techniques, mobile edge computing (MEC) concepts, and software-defined networking (SDN). The main focus will be the evolution of network virtualization and network slicing operations through several standards-definition activities over the last decade. In particular, we shed light on how network slicing becomes feasible in next-generation mobile networks by reducing the overall overhead and complexity of a full network deployment. The tutorial will also cover the concept of NFV, detailing virtual network function (VNF) management and orchestration, and showcasing NFV, MEC, and SDN as key technology enablers for the creation of elastic and flexible 5G mobile systems. It will then describe, using concrete examples, how cloud-based virtual mobile networks can be designed, instantiated, configured, managed, and orchestrated using current cloud infrastructure management tools, such as OpenStack and OpenDaylight. The tutorial will finish by highlighting a few open issues that form the focus of research efforts in the network softwarization arena.

TUT20: Short-Packet Communications: Fundamentals and Practical Coding Schemes
Room:
Capital Suite 13
Presenters: Giuseppe Durisi (Chalmers University of Technology, Sweden); Gianluigi Liva (DLR - German Aerospace Center, Germany); Fabian Steiner (Technische Universität München, Germany)

The design of block codes for short information blocks (e.g., a thousand information bits or fewer) is an open research problem that is gaining increasing relevance because of emerging applications in the area of low-latency wireless communication. In this tutorial, we shall review the fundamental tradeoff between throughput and reliability when transmitting short packets, using recently developed tools in finite-blocklength information theory. We will then describe state-of-the-art code constructions (involving binary/nonbinary LDPC and turbo codes, polar codes, and tail-biting convolutional codes) for the short-block regime, and compare their performance with nonasymptotic information-theoretic limits. Specifically, we will illustrate how to achieve performance close to the theoretical bounds with different performance vs. decoding-complexity trade-offs. Special emphasis will be given to the LDPC and polar code solutions selected within 3GPP for eMBB data and control channel signaling.
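The throughput-reliability tradeoff at the heart of this tutorial is often summarized by the normal approximation of finite-blocklength information theory: the maximal rate at blocklength n and error probability eps is roughly the capacity minus a dispersion penalty shrinking as 1/sqrt(n). A minimal sketch for the real-valued AWGN channel (assuming Python 3.8+ for `statistics.NormalDist`; not material from the tutorial itself):

```python
import math
from statistics import NormalDist

def normal_approx_rate(snr, n, eps):
    """Normal approximation R ~ C - sqrt(V/n) * Qinv(eps) + log2(n)/(2n)
    for the real AWGN channel (rate in bits per channel use)."""
    C = 0.5 * math.log2(1 + snr)                                   # capacity
    V = snr * (snr + 2) / (2 * (snr + 1) ** 2) * math.log2(math.e) ** 2  # dispersion
    qinv = NormalDist().inv_cdf(1 - eps)                           # Q^{-1}(eps)
    return C - math.sqrt(V / n) * qinv + math.log2(n) / (2 * n)

snr = 1.0  # 0 dB
for n in (100, 500, 2000):
    print(f"n={n:5d}: R ~ {normal_approx_rate(snr, n, 1e-3):.3f} bits/ch.use "
          f"(C = {0.5 * math.log2(1 + snr):.3f})")
```

The printed rates show the backoff from capacity that short-packet codes must pay, which is the benchmark the tutorial's code constructions are compared against.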

TUT21: Interference Management in Wireless Networks: Fundamental Bounds and the Role of Cooperation (Cancelled)
Presenters: Venugopal Veeravalli (University of Illinois at Urbana-Champaign, USA); Aly El Gamal (Purdue University, USA)

TUT22: Massive Wireless Networks in 5G and Beyond: Spatiotemporal Modeling and Design
Room:
Capital Suite 15
Presenters: Hesham ElSawy (King Fahd University of Petroleum and Minerals (KFUPM), Saudi Arabia); Mohamed-Slim Alouini (King Abdullah University of Science and Technology (KAUST), Saudi Arabia)

Massive wireless networks (MWNs), constituted of ultra-dense base stations, WiFi access points, sensors, actuators, machines, connected cars, drones, devices, and many other smart objects (things), will significantly contribute to the big-data supply and automation of the foreseen smart world. This information revolution will be realized via the massive, ubiquitous, and energy-efficient wireless connectivity provided by the fifth generation of cellular networks (5G), the Internet of Things (IoT), and cyber-physical systems (CPS). Unleashing the potential of the upcoming smart world necessitates revolutionary designs and methodologies for wireless networking in order to cope with the unprecedented challenges imposed by the intrinsic characteristics of MWNs. In particular, such networks are foreseen to emerge in different sectors (e.g., smart cities, public safety, health care, autonomous driving), with distinct spatial (e.g., wide-spread topology and massively many nodes), temporal (e.g., sporadic traffic patterns and battery levels), and contextual (e.g., heterogeneous devices and diverse applications) aspects. Efficient MWN design requires rigorous mathematical models that capture the essence of these networks. In this context, this tutorial presents a spatiotemporal mathematical framework, based on stochastic geometry and queueing theory, as a fundamental basis to model and analyze 5G networks and beyond. The theoretical foundations will then be used to design new technologies and services for 5G and beyond. To this end, several research directions will be discussed in the context of IoT, CPS, and 5G systems.
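As a small illustration of the stochastic-geometry side of such a framework (a sketch, not tutorial material): the classic downlink SIR coverage probability for a Poisson point process of base stations with nearest-BS association and Rayleigh fading can be estimated by Monte Carlo and, for path-loss exponent 4, checked against the closed form of Andrews, Baccelli, and Ganti (2011).

```python
import math
import random

def sample_poisson(mean, rng):
    """Knuth's Poisson sampler (adequate for moderate means)."""
    L = math.exp(-mean)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def coverage_prob(lam, theta_db, alpha, trials=4000, radius=20.0, seed=1):
    """Monte Carlo downlink SIR coverage for a PPP of base stations:
    nearest-BS association, Rayleigh fading, path-loss exponent alpha."""
    rng = random.Random(seed)
    theta = 10 ** (theta_db / 10)
    mean_pts = lam * math.pi * radius ** 2
    covered = 0
    for _ in range(trials):
        n = sample_poisson(mean_pts, rng)
        if n == 0:
            continue
        # BS distances: uniform points in a disk centred on the typical user.
        dists = [radius * math.sqrt(rng.random()) for _ in range(n)]
        # Per-BS received power = unit-mean exponential fade * path loss.
        powers = [rng.expovariate(1.0) * d ** (-alpha) for d in dists]
        i = dists.index(min(dists))        # serve from the nearest BS
        interference = sum(powers) - powers[i]
        if interference == 0 or powers[i] / interference > theta:
            covered += 1
    return covered / trials

# For alpha = 4 and threshold theta, the closed form is
# 1 / (1 + sqrt(theta) * (pi/2 - atan(1/sqrt(theta)))).
theta = 1.0
closed_form = 1 / (1 + math.sqrt(theta) * (math.pi / 2 - math.atan(1 / math.sqrt(theta))))
print(f"simulated: {coverage_prob(0.1, 0.0, 4.0):.3f}, closed form: {closed_form:.3f}")
```

Note that the SIR coverage is independent of the deployment density `lam` in this interference-limited setting, which is one of the scale-invariance insights such spatiotemporal models provide.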

TUT23: Connected Vehicles in the 5G Era
Room:
Capital Suite 16
Presenters: Claudio E. Casetti (Politecnico di Torino, Italy)

This tutorial will cover the state of the art of vehicular communication and networking, from the point of view of protocols and regulations, as well as the most recent proposals in the context of C-V2X, along with the requirements for the novel use cases they are ushering in. Overall, the tutorial will provide a comprehensive view of how inter-vehicular communication will fit into the broader 5G landscape.

TUT24: Mobile Edge Computing for Internet of Things (Cancelled)
Presenters: Yan Zhang (University of Oslo, Norway); Xiaofei Wang (Tianjin University, P.R. China)


Disclaimer: The tutorials schedule is subject to change of day/time and cancellation.
