Strategic Vision

Information Processing Systems

The large-scale distributed information processing systems envisioned by the Nano-Tera.CH initiative will generate unprecedented amounts of data, which must be processed, stored, and efficiently transmitted over large-scale, highly heterogeneous wireless networks. This poses tremendous challenges, detailed below, in various areas of information processing.

Performance Limits and Fundamental Information Processing Algorithms and Primitives

Nano-tera networks will need to cope with very large numbers of small and unreliable components. The traditional methods that have successfully helped engineers design small- to moderate-size networks are not suited to such large-scale systems. Techniques that can deal with massive scale, unreliable components, and device heterogeneity include, for example, large-deviation analysis, percolation theory, large random matrix theory, and network information theory. We will use these tools, and develop new ones where needed, to establish the communication-theoretic performance limits of large-scale wireless networks under realistic modeling assumptions. In addition, new signaling strategies and distributed signal processing and control algorithms need to be developed.

Information-theoretic performance limits: In 2000, Gupta and Kumar proposed analyzing how network capacity scales as the number of nodes grows large. This line of attack on network information theory allows us to understand the ultimate performance limits of large networks, but has focused predominantly on idealized assumptions about the network, such as non-fading (i.e., additive white Gaussian noise) links, or infinite scattering richness if the links are assumed to be fading. While the latter assumption may be realistic for small node densities, it is questionable at large node densities. Rather, it seems reasonable that a given volume exhibits only a finite number of "degrees of freedom". Characterizing this aspect from a propagation-theoretic point of view and studying the resulting impact on the capacity scaling behavior is an interesting open problem. For a finite number of nodes, the impact of signaling strategies, protocols, network geometry, path loss, and node characteristics (i.e., cheap vs. high-end nodes with significant signal processing capabilities) will be analyzed through percolation theory and large-deviation theory. In particular, cooperation between nodes and across OSI layers is expected to lead to fundamentally new effects at the system level.
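For reference, the baseline result of this line of work is the Gupta-Kumar scaling law: with $n$ randomly placed nodes sharing a channel of bandwidth $W$ and communicating point to point (no cooperation), the throughput available to each node behaves as

$$
\lambda(n) = \Theta\!\left(\frac{W}{\sqrt{n \log n}}\right),
$$

so the per-node rate vanishes as the network grows. It is precisely this baseline that fading models, finite scattering degrees of freedom, and node cooperation may modify.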

Signaling and communication techniques: The insights gained from the analysis of communication-theoretic performance limits will be used to design signaling strategies and distributed signal processing and control algorithms, and to evaluate their performance. We will include the impact (and modeling) of the scattering environment, multi-hop relaying, and cognitive radio. Some of these methods, at the boundary with other fields, will apply not only to wireless networks but will also yield insights into general large-scale systems, such as systems-on-chip, where worst-case design may at some point become too expensive and randomness eventually needs to be accounted for.

The legacy network design philosophy, which decomposes traffic into point-to-point connections, is not well suited to ambient systems. We will consider new communication techniques such as network coding, rateless coding, distributed source compression, and distributed message-passing ("gossip") algorithms, used for example to collectively sample and measure a spatially distributed field of variables of interest. These techniques hold the promise of revolutionizing the way we operate networks. They need to be decentralized and robust with respect to the randomness and heterogeneity of the environment.
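As a concrete illustration of the gossip idea, the following minimal sketch (function names and parameters are illustrative, not part of any planned system) shows randomized pairwise averaging: nodes repeatedly pick a random partner and average their values, and every node converges to the global mean of the initial readings without any central coordinator.

```python
import random

def gossip_average(values, rounds=10000, seed=0):
    """Randomized pairwise gossip: at each step, two randomly chosen
    nodes replace their values with their pairwise mean. The sum (and
    hence the mean) of all values is preserved at every step, so all
    nodes converge to the global average of the initial readings."""
    rng = random.Random(seed)
    x = list(values)
    for _ in range(rounds):
        i, j = rng.sample(range(len(x)), 2)  # a random communicating pair
        x[i] = x[j] = (x[i] + x[j]) / 2.0
    return x

# Example: 100 sensors observe a field of true value 20.0 with unit noise.
readings = [random.Random(k).gauss(20.0, 1.0) for k in range(100)]
estimates = gossip_average(readings)
print(min(estimates), max(estimates))  # both approach the sample mean
```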

Design and System Theory of Quantum Information Processing

During the last decade, information scientists and physicists have teamed up to invent new algorithms for quantum computers, which have been shown, in theory, to have the potential to vastly outperform today's computers on certain problems. New cryptographic schemes have been discovered that guarantee previously impossible properties such as unconditionally secure communication. Research in quantum information science has the potential to revolutionize cryptography, computing, and communication. Quantum cryptography is particularly relevant as a promising means to achieve data and communication security, a property crucial for all envisioned Nano-Tera.CH applications. The main technological challenge in the field is to design and fabricate quantum-coherent devices that can store and process this new type of information. The experimental control of quantum systems may not only form the basis of future quantum information processing; it is also required to study present-day electronic systems at their ultimate size limits (which are being approached in nano-scale systems) and to develop new hardware for storage.

The main emphasis of our research in this area is on the development of quantum tools: the enabling technologies for the control, manipulation, and readout of quantum systems. This goal will be pursued along multiple tracks, developing alternative schemes based on quantum optics, condensed matter, and atomic and molecular physics. Experimental work will be complemented by work on algorithms, with a focus on taking realistic models of quantum information processing devices into account. Another line of research is the understanding of quantum information processing using continuous-variable systems.

Signal and Image Processing

A sensor network typically takes a large number of measurements of a physical quantity (e.g., a temperature field) through scattered (low-cost) sensors. The data fusion task is then to extract the physical quantities from a (typically) highly redundant set of very noisy measurements. We plan to develop algorithms for data fusion, parameter estimation, and (networked) control from highly redundant measurements, and also for spatially filling in missing information (caused by faulty sensors) through generalized, non-uniform interpolation. Furthermore, we plan to study the corresponding ultimate limits on controllability and on reconstruction and estimation accuracy as a function of the (statistical) properties of the quantity to be sensed, the sensor characteristics, the sensor density, and the sensor locations.
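A minimal sketch of one classical fusion primitive, assuming independent measurements of the same scalar with known noise variances (the function name is illustrative): inverse-variance weighting, which yields the minimum-variance unbiased estimate.

```python
def fuse(measurements, variances):
    """Inverse-variance weighting: the minimum-variance unbiased estimate
    of a common scalar from independent noisy measurements with known
    noise variances. Returns the fused estimate and its variance."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    estimate = sum(w * m for w, m in zip(weights, measurements)) / total
    return estimate, 1.0 / total

# Example: three sensors of different quality observe the same temperature.
est, var = fuse([19.8, 20.4, 20.1], [0.5, 2.0, 1.0])
print(est, var)  # the most precise sensor dominates; var < every input variance
```

Redundancy pays off directly here: the fused variance shrinks with every added sensor, which is the kind of accuracy-versus-density relation the planned limit studies would characterize.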

Aerial and satellite imagery will produce information that needs to be processed automatically with the help of computers. We plan to automatically produce 3D models of the environment from images and video sequences, which is notoriously difficult because image data are both noisy and potentially ambiguous. We will address this problem by developing a model-based optimization approach in which an objective function expresses both geometric and photometric constraints on features of interest. Because features and surfaces will be modeled uniformly, we will be able to refine several models simultaneously and to enforce geometric and semantic constraints between objects, thereby increasing not only the accuracy but also the consistency of the reconstruction.
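In generic form (a hedged sketch only; the concrete terms are part of the research itself), such a model-based objective couples a photometric data term with geometric and semantic constraint terms over the parameters $\theta$ of all features and surfaces:

$$
E(\theta) = E_{\text{photo}}(\theta) + \lambda_{\text{geom}}\, E_{\text{geom}}(\theta) + \lambda_{\text{sem}}\, E_{\text{sem}}(\theta),
$$

minimized jointly over all object models so that constraints between objects are enforced during refinement rather than imposed afterwards.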

Novel imaging modalities will provide indirect measurements (e.g., line/region integrals in cell tomography) of the structure and/or function of the human body down to nanoscopic scales. We will take advantage of recent advances in wavelet theory and compressed sensing to derive specific reconstruction algorithms that improve the resolution and/or handle extreme cases where the data are incomplete (ill-posed problems). The work will involve close interaction with the engineers developing the instrumentation and with the end users; important issues are the adequate modeling of the image formation process, the numerical efficiency of the algorithms, and their validation.
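For illustration, a basic workhorse for such sparse reconstructions is iterative soft-thresholding (ISTA); the sketch below shows its generic textbook form (not the project's specific algorithm) for recovering a sparse signal from incomplete linear measurements.

```python
import numpy as np

def ista(A, y, lam=0.1, iters=200):
    """Iterative soft-thresholding (ISTA) for the sparse recovery problem
    min_x 0.5 * ||A x - y||^2 + lam * ||x||_1, i.e., reconstructing a
    sparse x from incomplete (m < n) linear measurements y = A x + noise."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = Lipschitz const. of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        z = x - step * (A.T @ (A @ x - y))   # gradient step on the data term
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # L1 prox
    return x

# Example: 64 random measurements of a length-256 signal with 3 active entries.
rng = np.random.default_rng(0)
A = rng.standard_normal((64, 256)) / 8.0   # roughly unit-norm columns
x_true = np.zeros(256)
x_true[[3, 70, 200]] = [1.0, -2.0, 0.5]
x_hat = ista(A, A @ x_true + 0.01 * rng.standard_normal(64), lam=0.05)
```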

Large-Scale Data Management

The envisioned systems require handling data of a size far beyond current limits, often continuously streamed rather than relatively static as in traditional databases, and obtained from heterogeneous sensing infrastructures. We plan to develop data management tools and algorithms for processing the continuous data streams produced by large collections of sensors and actuators. This will embrace efficient in-network data processing in wireless sensor networks, middleware based on service-oriented computing architectures for integrating heterogeneous sensor and actuator networks, and storage systems for indexing and retrieving massive volumes of stored sensor data streams.
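As a small sketch of the kind of incremental operator that in-network processing pushes toward the sensors (the function is illustrative; deployed systems typically expose such aggregates through declarative query interfaces), a sliding-window mean over an unbounded stream can be maintained with constant work per sample:

```python
from collections import deque

def sliding_window_mean(stream, window):
    """Incremental sliding-window mean over an unbounded sample stream:
    O(1) work per sample and O(window) memory, so it can run on a mote
    and reduce radio traffic by forwarding only the aggregate."""
    buf, total = deque(), 0.0
    for sample in stream:
        buf.append(sample)
        total += sample
        if len(buf) > window:
            total -= buf.popleft()
        yield total / len(buf)

# Example: smooth 1 Hz temperature samples over one-minute windows.
# for smoothed in sliding_window_mean(temperature_samples, window=60): ...
```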

Since sensor data will typically be post-processed through signal processing, data cleaning, and data fusion algorithms, efficiently supporting data access through such abstract, model-based views is essential in practice. Implementing model-based views poses challenging cross-layer optimization problems involving sensor networks, stream-processing middleware, and storage systems. Due to the highly dynamic nature of nano-tera systems, trading off efficiency against flexibility and generality will be a key engineering challenge. The resulting systems and algorithms have to be highly autonomous and self-organizing, since manual configuration and administration are ruled out at these system scales.

Ambient Web

Data generated by embedded devices and sensor networks will, in the future, not only be used by dedicated applications but also be published over a Web-like infrastructure, enabling novel services and community-based applications at global scale. Data publishing requires data interpretation for analysis, correlation, and decision making. Data interpretation is a highly application-dependent and subjective process and will thus result in many different models. Data from different sources and their models will be correlated in semantic networks, which form logical overlay networks over the physical data space, while data users will be connected in social networks.

We plan to develop automated techniques to interpret and integrate data generated by nano-tera systems and their corresponding semantic and social networks. For data interpretation, it is necessary to assess the quality of data sources and publishers while ensuring data privacy. Since most of the data and their interpretations are inherently uncertain, the approaches will typically be based on distributed probabilistic inference techniques, combining symbolic reasoning with content analysis. The techniques will exploit automatically generated content, as obtained from sensor fusion and image analysis, as well as input and feedback obtained from users and automatically derived system context. Systems and tools resulting from this research will follow recent developments and standards in semantic Web technologies for seamless integration with the current Web infrastructure.
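One simple building block for such probabilistic quality assessment, shown purely as a hypothetical illustration rather than the project's chosen model, is a Beta-Bernoulli update of a publisher's reliability from confirmed and contradicted reports:

```python
def update_trust(alpha, beta, confirmed):
    """Beta-Bernoulli reliability tracking: alpha and beta count a
    publisher's confirmed and contradicted reports so far; the posterior
    mean alpha / (alpha + beta) serves as the current trust score."""
    if confirmed:
        alpha += 1.0
    else:
        beta += 1.0
    return alpha, beta, alpha / (alpha + beta)

# Start from a uniform prior (alpha = beta = 1) and fold in report outcomes.
a, b = 1.0, 1.0
for outcome in [True, True, False, True]:
    a, b, trust = update_trust(a, b, outcome)
print(trust)  # 4/6, about 0.67, after three confirmations and one contradiction
```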

Communication Systems

Wireless networks relevant to the Nano-Tera.CH initiative will require high-end nodes acting as (cellular) infrastructure centers of the network, as well as communication schemes that lead to cheap transceivers suitable for short-range communication. While for the first category there is little doubt that these nodes will be based on multiple-input multiple-output (MIMO) technology, we expect that for the second category ultra-wideband (UWB) communication may be a suitable technology. For both technologies, major challenges remain in the areas of antenna miniaturization and realistic test-bed design, which are crucial for demonstrating systems engineered within Nano-Tera.CH.

Ultra-Wideband Communication

With the ever-increasing demand for higher data rates and new services, new spectrum needs to be allocated. However, the available spectral resources are already crowded and largely occupied by bandwidth-inefficient legacy systems such as terrestrial television and radio, analog two-way intercom, and business systems. Since a complete reallocation of frequency bands for existing wireless applications is out of the question, the search for alternatives is of major importance. Wireless systems operating over UWB channels with several gigahertz of bandwidth at very low power levels per hertz promise a solution to this problem thanks to their ability to coexist with legacy systems. Moreover, UWB simplifies multiple access, offers improved link reliability, and has potential for ranging applications. Important issues to be resolved include: (1) the characterization of the communication-theoretic performance limits under realistic channel models; (2) the design of novel techniques and methods for timing synchronization and low-complexity, interference-robust pulse detection; and (3) understanding how data-carrying UWB signals can be employed for ranging purposes.
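To illustrate point (3), ranging reduces to measuring time of flight: in a two-way exchange the initiator timestamps the round trip and the responder reports its turnaround delay. A minimal sketch follows (names and numbers are illustrative); note that at the speed of light a 1 ns timing error already corresponds to roughly 30 cm of range error, which is why the fine time resolution of UWB pulses matters.

```python
C = 299_792_458.0  # speed of light in m/s

def two_way_ranging_distance(t_round, t_reply):
    """Two-way time-of-arrival ranging: the initiator measures the
    round-trip time t_round (s); the responder reports its known
    turnaround delay t_reply (s). Half the difference is the one-way
    time of flight, which scales to distance by the speed of light."""
    return C * (t_round - t_reply) / 2.0

# Example: nodes 10 m apart, 5 microseconds of turnaround at the responder.
tof = 10.0 / C
print(two_way_ranging_distance(2 * tof + 5e-6, 5e-6))  # -> 10.0 (metres)
```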

High-End Wireless Communication Transceivers

The wireless industry has just started to integrate MIMO techniques into existing standards and to define new standards based on MIMO, e.g., the IEEE 802.11n standard for high-speed wireless local area networks (WLANs), the IEEE 802.16 standard for fixed broadband wireless access, and ongoing 4G pre-standardization efforts. A number of key issues remain to be resolved before the technology can be put to efficient use in practice, both in general and in the context of nano-tera systems. Probably the most pressing problems are in the area of VLSI implementation of MIMO transceivers and their integration (along with the RF front-end) into small devices capable of supporting multiple standards such as GSM and W-CDMA (convergence). The latter point is particularly important in the context of heterogeneous tera-scale systems, where a large number of nodes operating with potentially different communication standards (single- and multiple-antenna based) need to be coordinated.
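For context, the gain MIMO offers is captured by the standard capacity expression for a flat-fading channel with matrix $H \in \mathbb{C}^{N_r \times N_t}$, equal power allocation over the $N_t$ transmit antennas, and signal-to-noise ratio $\rho$:

$$
C = \log_2 \det\!\left( I_{N_r} + \frac{\rho}{N_t}\, H H^{\mathsf{H}} \right) \ \text{bits/s/Hz},
$$

which in rich scattering grows roughly linearly in $\min(N_t, N_r)$ rather than only logarithmically in power; turning this gain into silicon is precisely the VLSI challenge described below.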

VLSI implementation: The ultimate performance test for any algorithm to be used in a wireless communication system is its implementation in a real-time test environment. Ongoing real-time wireless MIMO test-bed designs at the participating institutions will be consolidated and extended. Research in this area will focus on: (1) the VLSI implementation of MIMO transceiver algorithms; (2) the development of concepts aimed at exploiting MIMO gains at the system level; and (3) the use of relaying to improve the scattering conditions for MIMO communications.

Integration and support for multi-standard operation: The key problems in this context are: (1) the development of RF and analog techniques, architectures, and circuit configurations that enable and/or favor reconfigurability and ensure compatibility with existing wireless standards; (2) the design of circuit configurations applicable to nano-scale CMOS technologies that operate under very low supply voltages and in densely populated device environments, and that are robust to poor leakage characteristics; (3) the design of architectures and circuit and device techniques for realizing the extremely low-power RF, analog, and digital circuits required for wireless sensor networks as well as personal and body area networks; and (4) the incorporation of MEMS devices into low-power and reconfigurable systems.

Miniaturization of Antennas

The performance of off-the-shelf wireless units used for sensor applications (in the GSM or ISM bands) is susceptible to environmental conditions. The layout of the antenna, the design of the front-end, and the integration of sensor functions are important aspects of the overall system development. The use of Low-Temperature Co-fired Ceramic (LTCC) technology, combined with miniaturized antennas, allows efficient integration with sensor functions into a rugged and robust front-end package useful for many applications, ranging from on-body sensor networks to communication systems (MIMO) to environmental sensors.

For ambient sensor networks, pico-satellite applications, and wearable systems, we propose to investigate new concepts in macro-antenna design, where a tradeoff must be made between miniaturization and radiation efficiency. In parallel with the reduction in size, further issues need to be taken into account, such as performance in adaptive antenna applications, ease of manufacturing, large bandwidths, and reconfigurability for multiple operating bands. To enable future micro/nano-systems (i.e., smart dust) or system-on-chip (SoC) applications such as RFIDs, we propose to investigate micro antennas. Here, the combination of new materials and sophisticated electromagnetic simulation tools is expected to lead to better sub-wavelength antenna performance. The size problem is even more pronounced in on-chip interconnects for nano-scale electronics. In this context, the use of plasmonic waveguides and plasmonic nano-sized antennas is expected to overcome the large size mismatch between electronic and photonic components (the latter limited by the fundamental law of diffraction). This technology can lead to radically new chip-scale device technology, facilitating information transport between nano-scale devices at optical frequencies and bridging the gap between the world of nano-scale electronics and micro-scale photonics.
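The miniaturization/efficiency tradeoff mentioned above is quantified by the classical Chu limit: an antenna fitting inside a sphere of radius $a$ and operated at wavenumber $k = 2\pi/\lambda$ has a radiation quality factor bounded below by

$$
Q \;\ge\; \frac{1}{(ka)^3} + \frac{1}{ka},
$$

so an antenna much smaller than a wavelength ($ka \ll 1$) is forced to a high $Q$, i.e., a narrow fractional bandwidth (roughly $1/Q$) and, in practice, reduced radiation efficiency.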

Wireless Sensor and Actuator Networks

Wireless sensor networks allow fine-grained measurements and thus provide more precise spatial and temporal data, e.g., for model verification in environmental science or for building monitoring and control. A first generation of platforms (motes) is now industrially available and deployed for a variety of applications. A number of problems, however, make scaling up beyond a few hundred nodes very hard, if not impossible. In the context of the envisioned ambient systems, we will particularly investigate the problems of: (1) autonomously deploying, maintaining, and administering large-scale wireless sensor networks; (2) enabling, supporting, and exploiting mobility of wireless sensor nodes (most current deployments assume static nodes); and (3) designing realistic test beds for testing wireless sensor networks at large scale, to overcome the evident shortcomings and fallacies of today's simulation tools.
