# Current areas of PhD study

This page contains a non-exhaustive list of areas that are currently offered to potential PhD students. Some of these areas have been specified in cooperation with our industrial partners.

- Procedural simulation of natural phenomena
- Procedural generation of cultural heritage environments
- Crowd modelling techniques for virtual environments
- Efficient formal analysis and verification of programs written in imperative languages
- Advanced techniques in modelling and verification of network-based services
- Algorithms and tools for automatic analysis and verification of programs
- LTL to automata translations and their applications
- Computational methods for systems biology
- Natural language processing and artificial intelligence
- Speech processing, dialogue systems and assistive technologies
- Advanced searching methods for digital data
- Structural and topological graph theory
- Parameterized complexity and algorithms
- Triangulated models for haptics and virtual reality
- Analysis of cell images in optical microscopy
- Models for programmability and traceability of self-organizing communication systems
- Quantum information processing - quantum algorithms, automata, cryptography
- Machine learning and data mining
- Mathematical modelling and numerical simulation of the transport of organic pollutants in air
- Quality-driven optimization in large architectures

#### Procedural Simulation of Natural Phenomena

This research aims to develop novel computer graphics algorithms for creating realistic landscapes for serious games and virtual environments. The mechanism still in wide use today differs little from the one described in the 1986 paper “The Definition and Rendering of Terrain Maps”; several rules, such as accounting for gravity, have since been added on top of that framework. This research will draw on real natural phenomena to better approximate landscapes, focusing on erosion and fluid modelling. For terrain erosion, a multi-fractal algorithm will be built on top of existing methods. The novelty of the algorithm is that it will account for causes of erosion that have not been considered before, such as the movement of the tectonic plates that, in reality, gives birth to terrain. For fluid modelling, the work will focus on approximating the Navier-Stokes equations, building on existing methods to predict the behaviour of lakes and seas under certain conditions.
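
To illustrate the kind of mechanism involved, the following is a minimal one-dimensional sketch (in Python; all function names and parameters are invented for illustration): fractal midpoint displacement followed by a simple thermal-erosion pass. The actual research targets full multi-fractal terrain and fluid simulation, which this sketch does not attempt.

```python
import random

def midpoint_displacement(n_iterations, roughness=0.5, seed=42):
    """Generate a 1D fractal height profile by recursive midpoint displacement."""
    rng = random.Random(seed)
    heights = [0.0, 0.0]
    amplitude = 1.0
    for _ in range(n_iterations):
        refined = []
        for a, b in zip(heights, heights[1:]):
            refined.append(a)
            refined.append((a + b) / 2 + rng.uniform(-amplitude, amplitude))
        refined.append(heights[-1])
        heights = refined
        amplitude *= roughness  # finer scales get smaller displacements
    return heights

def thermal_erosion(heights, talus=0.1, rate=0.5, steps=50):
    """Move material downhill wherever the local slope exceeds the talus angle."""
    h = list(heights)
    for _ in range(steps):
        for i in range(len(h) - 1):
            diff = h[i] - h[i + 1]
            if abs(diff) > talus:
                transfer = rate * (abs(diff) - talus) / 2
                if diff > 0:
                    h[i] -= transfer
                    h[i + 1] += transfer
                else:
                    h[i] += transfer
                    h[i + 1] -= transfer
    return h

profile = midpoint_displacement(6)
eroded = thermal_erosion(profile)
```

Note that the erosion pass only moves material around: the total amount of terrain is conserved, which is one of the physical constraints such algorithms try to respect.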

Candidate: We are looking for a highly motivated candidate who holds a Master of Science in Computer Science or Computer Engineering, with a solid background in computer graphics or computer vision. The candidate must have excellent programming skills in an object-oriented language such as C++. Experience with OpenGL or DirectX is also desirable. Fluent communication skills in English are expected.

**Supervisor:** doc. Fotis Liarokapis, Ph.D.

#### Procedural Generation of Cultural Heritage Environments

Computer visualization of ancient settlements and ruins is very important and useful for several disciplines and branches of science, such as archaeology, architecture, the creative industry, and scientific fields related to the exploration of cultural heritage. Procedural modelling of cities offers a way of generating the same level of detail as manually modelled versions, but it has not been explored in the field of archaeology, due to the geometrical complexity of objects and buildings. The aim of this project is to design a dedicated grammar for generating specific case studies, and to investigate algorithms that directly render curved surfaces, such as NURBS (non-uniform rational B-splines) or Bezier surfaces. These surfaces can represent complex 3D objects with far fewer surfaces than planar representations, but the rendering algorithms are more complex. The student will initially study the mathematics of curved-surface representation, and subsequently implement novel algorithms for complex geometrical shapes.
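
For a flavour of the curved-surface mathematics involved, here is a small sketch (Python; names are illustrative only) of de Casteljau's algorithm, the standard way to evaluate a Bezier curve by repeated linear interpolation; Bezier surfaces extend the same idea to two parameters.

```python
def de_casteljau(control_points, t):
    """Evaluate a Bezier curve at parameter t by repeated linear interpolation."""
    pts = [tuple(p) for p in control_points]
    while len(pts) > 1:
        # each pass interpolates adjacent points, shrinking the list by one
        pts = [tuple((1 - t) * a + t * b for a, b in zip(p, q))
               for p, q in zip(pts, pts[1:])]
    return pts[0]

# A cubic Bezier in 2D: the endpoints are interpolated exactly.
ctrl = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
start = de_casteljau(ctrl, 0.0)
mid = de_casteljau(ctrl, 0.5)
end = de_casteljau(ctrl, 1.0)
```

The algorithm is numerically stable, which is one reason it underlies many curved-surface rendering pipelines despite its simplicity.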

Candidate: We are looking for a highly motivated candidate who holds a Master of Science in Computer Science or Computer Engineering, with a solid background in computer graphics or computer vision. The candidate must have excellent programming skills in an object-oriented language such as C++. Experience with OpenGL or DirectX is also desirable. Fluent communication skills in English are expected.

**Supervisor:** doc. Fotis Liarokapis, Ph.D.

#### Crowd Modelling Techniques for Virtual Environments

Topic: Although various methods for procedurally generating crowds have been proposed in recent years, accurately modelling the behaviour of crowds remains an open problem. This project will examine different crowd simulation algorithms that can be used in a virtual environment such as a virtual city. The main categories of crowd simulation are flow-based, entity-based and agent-based approaches. Flow-based approaches simulate highly dense crowds not by considering individuals in the crowd, but by treating it as a continuous flow of fluid. Entity-based approaches model individuals as a set of similar entities, typically in a setup closely resembling a particle system. Agent-based approaches implement agents as autonomous individuals, often capable of reacting to events and changes within the virtual environment. The student will research the above types and develop a novel crowd-modelling algorithm for two different case studies. This research is expected to feed back into the development processes of simulating inhabited locations, by identifying the key features that need to be implemented to achieve more perceptually realistic crowd behaviour. The main areas of research include computer graphics, artificial intelligence and image processing.
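
A toy agent-based update step (Python; the forces and constants are invented for illustration and are far simpler than the models this project targets) shows the basic ingredients: each agent steers toward a shared goal while being repelled from neighbours that come too close.

```python
import math

def step_agents(positions, goal, dt=0.1, min_dist=1.0, repulse=1.0, attract=1.0):
    """One update of a minimal agent-based crowd model."""
    new_positions = []
    for i, (x, y) in enumerate(positions):
        # attraction toward the goal, proportional to distance
        fx = attract * (goal[0] - x)
        fy = attract * (goal[1] - y)
        for j, (ox, oy) in enumerate(positions):
            if i == j:
                continue
            dx, dy = x - ox, y - oy
            d = math.hypot(dx, dy)
            if 0 < d < min_dist:
                # repulsion grows as agents approach each other
                fx += repulse * dx / (d * d)
                fy += repulse * dy / (d * d)
        new_positions.append((x + dt * fx, y + dt * fy))
    return new_positions

agents = [(0.0, 0.0), (0.2, 0.0), (5.0, 5.0)]
for _ in range(20):
    agents = step_agents(agents, goal=(10.0, 10.0))
```

Entity- and flow-based approaches replace the inner loop by particle-system dynamics or by a continuum fluid model, respectively.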

Candidate: We are looking for a highly motivated candidate who holds a Master of Science in Computer Science or Computer Engineering, with a solid background in computer graphics or computer vision. The candidate must have excellent programming skills in an object-oriented language such as C++. Experience with OpenGL or DirectX is also desirable. Fluent communication skills in English are expected.

**Supervisor:** Fotis Liarokapis, Ph.D.

#### Efficient formal analysis and verification of programs written in imperative languages

In practice, the currently dominant approach to the verification of programs written in
imperative languages (such as C, C#, or Java) is *testing*. A given
program is repeatedly run on a large selection of input data, and the outcomes of these
trial runs are evaluated and analysed. The main disadvantages of testing are its
cost and its inherent unreliability.

A more sophisticated approach to program analysis and verification is based on
*formal methods*, which make it possible to discover certain types of errors, or to prove
their absence, just by analysing the source code of a given program. The main disadvantage
of formal methods is their computational complexity, which often prevents the
analysis of programs of realistic size.

The topics of PhD theses in this area are focussed on developing practically usable formal methods that are applicable to real-world programs written in imperative languages. The expected outcomes are new theoretical concepts and their implementation into an integrated software tool.

**Supervisor:** prof. RNDr. Antonín Kučera, Ph.D.

**Co-supervisor:** Mgr. Jan Obdržálek, Ph.D. (contact person)

#### Advanced techniques in modelling and verification of network-based services

The aim of this research topic is to develop rigorous methods for modelling and analysing various network-based services. The major problem is that while functionality specifications (protocols) are often formally described, they are not complete and leave room for different interpretations in the implementation phase. The functionality of actual implementations is therefore influenced by only vaguely defined assumptions, which leads to incompatibility and instability in systems composed of two or more interacting components.

We study formalisms for specifying network-based systems at a very early stage of design. A variety of studies in this research area have been devoted to the Message Sequence Charts (MSC) formalism. Even though MSC is formally specified in ITU Recommendation Z.120, most of the theoretical results deal only with subsets of this formalism. On the other hand, there are many undecidability results for MSC in its full expressive power.

The aim of this research is to find a subset of MSC (or a useful modification of it) that can express important design features and, at the same time, preserves the decidability of significant verification questions.

**Supervisor:** prof. RNDr. Mojmír Křetínský, CSc.

**Co-supervisor:** RNDr. Vojtěch Řehák, Ph.D. (contact person)

#### Algorithms and tools for automatic analysis and verification of programs

The aim of this research topic is to develop techniques and tools for automatic program analysis (where a program can be given by its source code or as an executable). Among the most frequently used techniques are model checking, bounded model checking, abstract interpretation, symbolic execution, and their combinations. Different techniques provide outcomes of different forms and applications. The outcomes are typically used to find bugs in a given program (either directly or by generating tests that cover the program code well) or to prove that the program satisfies its specification. Current program analysis methods can, for example, find bugs in heap manipulation (memory leaks, NULL pointer dereferences), locking errors, etc.
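
A minimal flavour of explicit-state analysis (Python; the toy program and all names are purely illustrative): breadth-first exploration of a state space that either returns a shortest run reaching an error state or reports that no error state is reachable.

```python
from collections import deque

def find_error_path(initial, successors, is_error):
    """Explicit-state reachability: BFS over the state space, returning a
    shortest path to an error state, or None if no error state is reachable."""
    parent = {initial: None}
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        if is_error(state):
            # reconstruct the counterexample by following parent links
            path = []
            while state is not None:
                path.append(state)
                state = parent[state]
            return list(reversed(path))
        for nxt in successors(state):
            if nxt not in parent:
                parent[nxt] = state
                queue.append(nxt)
    return None

# Toy program: x starts at 0; each step may add 2 or 3; the "bug" is x == 7.
path = find_error_path(
    0,
    successors=lambda x: [x + 2, x + 3] if x < 7 else [],
    is_error=lambda x: x == 7,
)
```

Real tools fight the state-explosion problem that this naive enumeration runs into, e.g. by abstraction or symbolic representations.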

Unfortunately, current techniques are usually either computationally hard (and thus do not explore the whole program) or imprecise (and thus report bugs that do not appear in any program execution). A PhD thesis in this area can focus, for example, on mitigating these deficiencies or on developing tools applicable to real-world programs.

**Supervisor:**
doc. RNDr. Jan Strejček, Ph.D.

#### LTL to automata translations and their applications

Linear Temporal Logic (LTL) is one of the most popular formalisms for describing properties of computer system runs, as it can succinctly express many interesting properties and has a simple semantics. Translations of LTL to various types of automata have been studied for decades and have found many applications, mainly in program verification (e.g. model checking and bounded model checking), system monitoring, system synthesis, the analysis of probabilistic systems, etc.
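
As a tiny illustration of LTL semantics (Python; the tuple encoding of formulas is an ad-hoc choice for this sketch, and finite-trace semantics is used here instead of the infinite-word semantics that automata translations actually target):

```python
def holds(formula, trace, i=0):
    """Evaluate an LTL formula over a finite trace (a list of sets of
    atomic propositions), using the natural finite-trace semantics."""
    op = formula[0]
    if op == "ap":                       # atomic proposition
        return formula[1] in trace[i]
    if op == "not":
        return not holds(formula[1], trace, i)
    if op == "and":
        return holds(formula[1], trace, i) and holds(formula[2], trace, i)
    if op == "X":                        # next
        return i + 1 < len(trace) and holds(formula[1], trace, i + 1)
    if op == "U":                        # until
        return any(
            holds(formula[2], trace, k)
            and all(holds(formula[1], trace, j) for j in range(i, k))
            for k in range(i, len(trace))
        )
    if op == "F":                        # eventually
        return any(holds(formula[1], trace, k) for k in range(i, len(trace)))
    if op == "G":                        # globally
        return all(holds(formula[1], trace, k) for k in range(i, len(trace)))
    raise ValueError(f"unknown operator {op!r}")

trace = [{"req"}, set(), {"grant"}, {"grant"}]
# "every request is eventually granted", written as G(not(req and not F grant))
spec = ("G", ("not", ("and", ("ap", "req"),
                      ("not", ("F", ("ap", "grant"))))))
```

An LTL-to-automata translation compiles such a formula into an automaton that accepts exactly the runs satisfying it, so that verification reduces to automata-theoretic questions.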

A PhD thesis in this area can focus, for example, on optimizing the LTL-to-automata translation for specific applications, or on optimizing the translation for specific LTL fragments. Another possible topic is the use of less common automata types (both as the target of the translation and as the input of subsequent applications).

**Supervisors:**
prof. RNDr. Mojmír Křetínský, CSc.,
doc. RNDr. Jan Strejček, Ph.D.

#### Computational methods for Systems Biology

Topics of PhD theses in this area are related to the development and application of computational science and technology to enhance our understanding of the molecular mechanisms underlying the behaviour of living systems, and to the development of scalable methods and tools for the modelling and computerized analysis of large and complex biological systems.

**Supervisor:**
prof. RNDr. Luboš Brim, CSc.

**Co-supervisor:**
RNDr. David Šafránek, Ph.D.

#### Natural Language Processing and Artificial Intelligence

The research focuses in particular on text corpora and corpus tools, syntactic and semantic analysis of natural language, dialogue and question-answering systems, computer lexicography together with machine-readable dictionaries, and, last but not least, machine translation. The field of Natural Language Processing is closely related to the issues of AI and intertwines with knowledge representation, which comprises semantic networks, logical calculi (deduction systems) and various types of ontologies. Current issues of the Semantic Web also belong here.

From a methodological point of view this is an attractive area in which we encounter difficult problems whose solution can make human-computer communication easier. At present, this communication can be characterized as exclusively 'one-way': humans have to have sufficient knowledge about the structure of the software to be able to use computers at a reasonable level. The goal we want to reach, however, is 'two-way' communication, which will bring computers closer to humans who do not possess such knowledge. The contemporary research paradigms exploit rule-based, statistical, or hybrid approaches.

The topics of dissertations in this area are oriented towards solutions of the problems mentioned above. Students are expected to seek new theoretical approaches and techniques in Natural Language Processing, as well as their implementation in the form of tools usable in further research or in practical applications.

These topics have been developed within the NLP Laboratory, which came into existence at the Faculty of Informatics in 1997-8. The Laboratory is now named the Natural Language Processing Centre; it has cooperated with the Seznam company for a long time, so topics suited to this cooperation can be formulated. Another interesting opportunity is research cooperation with the British company Lexical Computing Ltd. Papers based on successful dissertations can be published in the proceedings of international conferences and in reviewed journals related to NLP and AI.

**Supervisors:**
doc. PhDr. Karel Pala, CSc., doc. RNDr. Aleš Horák, Ph.D.

**Co-supervisor:**
doc. Mgr. Pavel Rychlý, Ph.D.

#### Speech processing, dialogue systems and assistive technologies

Speech processing, covering speech recognition and synthesis, is among the most interesting fields of computer science, producing applications that will become even more important in the near future. Exploiting speech technologies, dialogue systems form an intelligent human-computer interface primarily based on speech. Assistive technologies aim at applications that improve quality of life, especially by helping people with disabilities.

In this field, thesis topics may be related to specific problems being solved within the Laboratory of Speech and Dialogue, for instance dialogue-based generation of graphical objects and web pages, speech synthesis and recognition, or assistive technologies; however, all research ideas are welcome.

**Supervisor:**
doc. RNDr. Ivan Kopeček, CSc.

#### Advanced searching methods for digital data

Our research focuses on fast similarity search in large data collections. Standard techniques based on centralized directories lack the scalability needed to cope with the exponential growth of data in the current digital explosion. Distributed systems (such as GRID or peer-to-peer networks), on the other hand, offer the necessary resources. We concentrate our efforts on both structured and unstructured distributed systems. A very important area of research is the specification of similarity measures, i.e. the metrics used to compare pieces of data. The choice of an appropriate metric function is crucial not only for the quality of the search results but also for the response time.
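
A minimal sketch of one standard metric-space idea, pivot filtering (Python; the data and names are invented for illustration): distances to a fixed pivot, computed once, give a triangle-inequality lower bound that discards many objects without ever evaluating the possibly expensive distance to the query.

```python
def range_search(data, distance, query, radius, pivot):
    """Range query in a metric space with pivot-based filtering."""
    d_qp = distance(query, pivot)
    results, evaluated = [], 0
    for obj in data:
        d_op = distance(obj, pivot)          # in practice precomputed offline
        # triangle inequality: distance(query, obj) >= |d_qp - d_op|
        if abs(d_qp - d_op) > radius:
            continue                         # safely pruned, no query distance needed
        evaluated += 1
        if distance(query, obj) <= radius:
            results.append(obj)
    return results, evaluated

data = list(range(0, 100, 5))                # 0, 5, 10, ..., 95
hits, evaluated = range_search(
    data, lambda a, b: abs(a - b), query=50, radius=7, pivot=0)
```

The pruning is lossless: every discarded object is provably outside the query radius, whatever the metric, which is why this works for images or audio just as well as for numbers.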

Possible topics for PhD theses:

- Ranking and Relevance Feedback in Image Retrieval
- Similarity Search Architectures for WEB Databases
- Self-Organized Search Networks
- Computational Advertising
- Clustering and Categorization in Metric Spaces
- Similarity Search Applications: video, audio, music
- Similarity Searching: Beyond the Metric Space
- Data Cleaning and Integration
- Collaborative Filtering
- Performance Tuning on Distributed Hardware Infrastructures
- Approximative Similarity Search
- Dimensionality Reduction Techniques

**Supervisor:** prof. Ing. Pavel Zezula, CSc.

#### Structural and topological graph theory

We study applications of deep theoretical results from structural and topological graph theory in the design of new, faster algorithms. This is a theoretical topic with expected collaboration with some of the leading international experts in these areas. We propose the following specific research topics, but other related topics can be arranged on demand:

- Non-traditional width parameters of graphs, e.g. rank-width, DAG-width, or others. Their mathematical properties, and algorithmic use.
- The crossing number of a graph: structure of crossing-critical graphs, efficient computation/approximation of the crossing number for graphs close to planarity.
- Other problems of structural and topological graph theory related to algorithms and complexity.

**Supervisor:**
prof. RNDr. Petr Hliněný, Ph.D.

#### Parameterized complexity and algorithms

We focus on questions and problems in the relatively new area of parameterized complexity. This theory provides a new perspective on the difficulty of many NP-hard problems, allowing efficient solutions under additional restrictions (parameterization). This is, again, a theoretical field with expected collaboration with some of the leading international experts. We propose the following specific research topics, but other related topics can be arranged on demand:

- Applying results of structural graph theory in parameterized complexity and in parameterized algorithm design.
- Studying new "width" parameters of graphs and digraphs, and comparing their algorithmic strength.
- Algorithmic metatheorems and metakernelization results based on expressibility of problem classes in suitable logic (FO, MSO).
- Use of structural graph theory and suitable decompositions of graphs in faster and memory-limited route planning algorithms on huge graphs.
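
A classic first example of a parameterized algorithm (Python sketch; illustrative only) is the bounded search tree for VERTEX COVER: with the solution size k as the parameter, the running time is about O(2^k · m), i.e. exponential only in the parameter, not in the input size.

```python
def vertex_cover(edges, k):
    """Bounded search tree for VERTEX COVER parameterized by solution size k:
    pick any uncovered edge (u, v); some endpoint must be in the cover,
    so branch on both choices. Depth of the recursion is at most k."""
    if not edges:
        return set()                     # nothing left to cover
    if k == 0:
        return None                      # budget exhausted but an edge remains
    u, v = edges[0]
    for chosen in (u, v):
        rest = [e for e in edges if chosen not in e]
        sub = vertex_cover(rest, k - 1)
        if sub is not None:
            return sub | {chosen}
    return None

# A 4-cycle has a vertex cover of size 2, but none of size 1.
cycle = [(1, 2), (2, 3), (3, 4), (4, 1)]
cover = vertex_cover(cycle, 2)
```

Kernelization results in this area go further: they shrink the instance itself to a size bounded by a function of k before any search starts.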

**Supervisor:**
prof. RNDr. Petr Hliněný, Ph.D.

#### Triangulated models for haptics and virtual reality

Triangulated models have been the main tool for representing geometric objects since the very beginning of computer graphics. There are several categories: first, 2D triangulations, which can be used for modelling terrains or other flatland objects; they can also be built in a parametric space and mapped to the surface of a geometric model. Second, 3D triangulations (tetrahedralizations), used to model geometric bodies. Third, surface triangulated models, obtained by surface reconstruction from scattered points or by piecewise-linear interpolation of various functional representations. Many algorithms and methods for all these models have been developed; most of them work well in the context of computer graphics and its typical applications. However, when the input data are large and moving, and the application imposes non-geometric criteria of triangle optimality for soft deformations and other shape changes, these problems are far less well solved, and the known solutions are far less satisfactory for new application areas.
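
One primitive recurring behind surface location and collision queries on triangulated models is the point-in-triangle test via barycentric coordinates; a 2D sketch (Python; illustrative only):

```python
def barycentric(p, a, b, c):
    """Barycentric coordinates of point p with respect to triangle (a, b, c) in 2D."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    det = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    l1 = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / det
    l2 = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / det
    return l1, l2, 1.0 - l1 - l2

def point_in_triangle(p, a, b, c):
    """p lies inside (or on the boundary of) the triangle iff all three
    barycentric coordinates are non-negative."""
    return all(l >= 0.0 for l in barycentric(p, a, b, c))

tri = ((0.0, 0.0), (4.0, 0.0), (0.0, 4.0))
inside = point_in_triangle((1.0, 1.0), *tri)
outside = point_in_triangle((3.0, 3.0), *tri)
```

The same coordinates also serve for interpolating per-vertex attributes, which is what makes them useful when computing contact points for force feedback.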

The main goal of the research is to develop algorithms and methods suitable for VR simulations and to use them in this context. The developed solution will be used mainly in the context of multimodal human computer interaction methods, namely for training applications.

Possible topics for PhD theses:

- Algorithms for fast location on the surface of geometric models.
- Local refinement and coarsening of triangular surface during a haptic interaction.
- Triangulated models based on kinetic data; research of non-geometrical criteria of optimality.
- Modeling of deformations and evolution of geometrical objects.
- Path planning and searching in VR applications in partly known or unknown environments, with the possibility to rapidly re-plan the path when object shapes change.
- Fast frame-to-frame collision detection of triangulated models as an input for correct force-feedback calculation.
- Development of fine rigid as well as flexible tools with dense triangulated surface structures and their application to fine grained haptic interactions.
- Design and implementation of detailed surface models for VR scenes, allowing the simulation of interactions between human-controlled tools and rigid and soft surface models.

**Supervisor:**
doc. Ing. Jiří Sochor, CSc.

#### Analysis of cell images in optical microscopy

Modern molecular biology is primarily concerned with studies of the structure,
function and dynamics of cellular components. For this purpose it mainly uses
optical microscopy, which enables the capture of *spatial (3D) image
data* as well as the *time course* of cellular processes in live cells
without damaging them (i.e. in natural conditions). The motivation of this research
is the effort to describe the cellular processes that lead to *deleterious (especially cancerous)
human diseases* and, based on this knowledge, to develop new, efficient
diagnostic as well as therapeutic approaches.

Because these studies produce huge multi-dimensional image data
that must be captured, digitized and processed by computer in an appropriate manner,
they cannot be realized without experts in the field of digital image processing.
Such help is offered to molecular biologists by the
Centre for Biomedical Image Analysis
at FI MU. The Centre has not only computing equipment but also
*molecular biology laboratories* for the preparation of cells and *optical
laboratories* equipped with unique automated microscopes driven by
our own software. Hence, the Centre handles the whole process of a cell sample study,
starting with taking the cell sample in hospital and ending with the evaluation of the
results produced by computer.

Participation in the activities of the Centre is also possible through PhD studies. In this case, the expected output is not only the proposal of new methods but also their working implementation in a format compatible with the existing library that is continually being developed at the Centre.

More specifically, attention is paid mainly to tasks in the area of
*segmentation of cells* and their components and,
in the case of live cells, also to the *tracking of moving objects* inside
the cell as well as the movement of the cell itself. We usually strive to
*automate* and *optimize* the developed methods to a large extent
with respect to the quality and/or quantity of the produced results.
We also pay attention to the *simulation of image formation* in optical systems:
creating models of the cells, of the blur caused by the optical system, of the electronic noise,
and of other artifacts, in order to be able to assess the reliability
of the developed image analysis methods, and to find ways of correcting the
mentioned imperfections by hardware and/or software means.
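
A toy version of the segmentation step (Python; real cell images need far more robust methods, e.g. watershed or level-set segmentation): global thresholding followed by 4-connected component labelling.

```python
from collections import deque

def label_components(image, threshold):
    """Threshold a grey-level image (list of rows), then label 4-connected
    foreground components by breadth-first flood fill."""
    rows, cols = len(image), len(image[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and labels[r][c] == 0:
                current += 1                 # found a new, unlabelled object
                queue = deque([(r, c)])
                labels[r][c] = current
                while queue:
                    y, x = queue.popleft()
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] >= threshold
                                and labels[ny][nx] == 0):
                            labels[ny][nx] = current
                            queue.append((ny, nx))
    return labels, current

image = [
    [9, 9, 0, 0],
    [9, 0, 0, 8],
    [0, 0, 8, 8],
]
labels, n = label_components(image, threshold=5)
```

Tracking then amounts to matching such labelled objects between consecutive frames, which becomes hard when cells touch, divide or move quickly.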

The prerequisites for working on these topics are the willingness to learn something new from other fields (especially optics and molecular biology) and to communicate with colleagues with these backgrounds, who are also employed directly by the Centre. The student is typically expected to spend one semester at a similar centre abroad during the PhD studies.

**Supervisor:**
prof. RNDr. Michal Kozubek, Ph.D.

**Co-supervisors:**
doc. RNDr. Pavel Matula, Ph.D.,
doc. RNDr. Petr Matula, Ph.D.,
RNDr. David Svoboda, Ph.D.
(contact persons)

#### Models for programmability and traceability of self-organizing communication systems

The topic of this dissertation is to develop programming models that allow the behaviour of self-organizing systems to be modified without disrupting their basic self-organizing constraints. The ability to modify the behaviour will make it possible to create more flexible communication systems, where the user can adjust properties such as the trade-off between the speed of network establishment (within given requirements) and its properties (such as real transmission rate, latency, or the level of robustness based on link redundancy). The work will focus especially on finding models that also make it possible to understand the behaviour of the self-organizing systems being established and that provide at least basic means for tracing and debugging. This means they need to enable the monitoring of system behaviour during the self-organization process and provide a reproducible environment in which errors and unexpected states can be reproduced and analysed. The results of this research can be integrated into the current CoUniverse framework.

Time frame: the work could start in the academic year 2008/2009 with bibliographical research and the specification of requirements for the model; 2009/2010: proposals of the first generation of the programming model, which will guarantee the global properties of the self-organizing system; 2010/2011: work on tracing and debugging abilities, a second generation of the model, experiments and simulations; 2011/2012: finalization of experiments, evaluation, and completion of the PhD thesis.

**Supervisor:**
prof. RNDr. Luděk Matyska, CSc.

**Co-supervisor:**
doc. RNDr. Petr Holub, Ph.D. (contact person)

#### Quantum information processing - quantum algorithms, automata, cryptography

Quantum information processing is a new trend in the field of information processing, computation, communication and security. The specific laws of quantum mechanics and the properties of the microscopic quantum world can be harnessed to solve problems that cannot be solved with classical computers, or to solve them more efficiently.

The research focuses in particular on quantum finite automata, quantum algorithms, the design of quantum circuits and protocols, and various problems of quantum cryptography: quantum key generation, encryption, authentication, anonymity, etc.

Topics of PhD theses can also be related to specific problems of the theory of quantum information, quantum error-correction codes, quantum entanglement etc.
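
A one-qubit flavour of what is being simulated and reasoned about (Python; a toy statevector calculation, not a research tool): the Hadamard gate turns the basis state |0> into an equal superposition, and applying it twice returns the original state.

```python
import math

def apply_gate(gate, state):
    """Apply a 2x2 single-qubit gate (list of rows) to a statevector."""
    return [sum(gate[i][j] * state[j] for j in range(2)) for i in range(2)]

# The Hadamard gate: H = (1/sqrt(2)) * [[1, 1], [1, -1]]
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

zero = [1.0, 0.0]                    # the basis state |0>
plus = apply_gate(H, zero)           # equal superposition of |0> and |1>
probs = [abs(a) ** 2 for a in plus]  # measurement probabilities
back = apply_gate(H, plus)           # H is self-inverse: back to |0>
```

Classical simulation of this kind scales exponentially with the number of qubits, which is precisely why quantum algorithms can offer an advantage.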

Students are expected to join the Laboratory of Quantum Information Theory and Cryptography. The laboratory regularly organizes research and "hot topics" seminars where new results are presented. The lab is one of the organizers of the international workshop CEQIP (Central European Quantum Information Processing), held each year in the Czech Republic.

**Supervisor:**
prof. RNDr. Jozef Gruska, DrSc.

**Co-supervisors:**
doc. RNDr. Jan Bouda, Ph.D.,
doc. Mgr. Mario Ziman, Ph.D.,
prof. RNDr. Tomáš Tyc, Ph.D.
(contact persons)

#### Machine learning and data mining

**Supervisor:**
doc. RNDr. Lubomír Popelínský, Ph.D.

#### Mathematical modelling and numerical simulation of the transport of organic pollutants in air

Mathematical modelling and numerical simulation of the transport of organic pollutants in air and their transformations in, and across the interfaces of, environmental compartments (air, biota, soil, sea surface). The project extends an existing air pollution model to a fully integrated regional multi-compartment model. The objective is the development and implementation of the mathematical model and its numerical simulation on a high-performance computing platform (CERIT-SC), including the interpretation of regional and long-term monitoring data in cooperation with the RECETOX Institute of the Faculty of Science.

For illustration, we list a couple of possible topics below; other related topics can easily be added by agreement:

- Model simulations and visualisation and evaluation of model results by experimental (field, monitoring) data and model data from other sources.

- Development of a numerical model of organic pollutant cycling in the lower atmosphere and in exchange with ground surfaces (soils, sea surface) as a combination of existing components, i.e. an atmospheric dispersion model and atmospheric chemistry, aerosol and surface mass exchange modules.

- Programming and implementation of subroutines for data input and utilisation of model data from other sources.

**Supervisor:**
prof. RNDr. Jiří Hřebíček, CSc.

#### Quality-driven optimization in large architectures

In both research and industrial development, many methods exist that guide the development of functionally correct systems (including formal verification and testing). However, functional correctness is only one of many quality attributes of software systems, next to performance, reliability, security, energy consumption, maintainability, and many others. With the increasing complexity of software systems, and of their architectures in particular, new techniques are needed for the quality-driven design and optimization of large architectures, targeting the desired system qualities.

The topics of PhD theses in this area focus mainly on the reliability, performance and scalability of applications in two main contexts. The first is smart infrastructures, used namely within the context of smart energy grids; the second is optimal architectures of applications deployed into cloud environments.

**Supervisor:**
doc. RNDr. Tomáš Pitner, Ph.D.

**Co-supervisor:**
Ing. RNDr. Barbora Bühnová, Ph.D.

Responsible contact: Ing. Dana Komárková