Seminar programme for the academic year 2008/2009

Autumn 2008

18. 9. 2008
Opening seminar of the autumn semester
Programme:
Information on the concept of the seminar in the autumn semester.
Agreement on the seminar programme.
Discussion.
25. 9. 2008
Michal Hejč: Data Quality Model
Abstract:
The presentation starts with a short overview of the research in the field and with definitions of basic terms, heading towards the definition of a new model. It introduces a prototype of the model and also a model of data value. The use of the model and its role in the process of data quality measurement are also discussed. The concept is illustrated by a case study of South Moravian waste management data evaluation and compared with the approach of the Ministry of the Environment of the Czech Republic. The conclusion suggests possible future use of the model for data from the web community.
2. 10. 2008
Václav Němčík: TBA
Abstract:
At present, anaphora resolution is one of the biggest challenges in the field of natural language understanding. Although anaphora plays an important role in human communication and is in its essence an interdisciplinary issue, it is not widely familiar to computer scientists. Therefore, this talk provides a brief insight into the relevant linguistic background and an overview of various flavours of anaphora and their computational aspects. Finally, it mentions state-of-the-art methods for AR and sketches research aims on the way towards a badly needed anaphora resolution system for Czech.
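To make the task concrete, the following sketch (not the speaker's method, just a deliberately naive baseline over assumed gender/number annotations) resolves a pronoun to the most recent preceding mention that agrees with it in gender and number:

    def resolve_pronoun(pronoun, candidates):
        """Naive recency baseline: candidates are (mention, gender, number) in textual order."""
        for mention, gender, number in reversed(candidates):
            # Take the closest preceding mention with matching morphological agreement.
            if gender == pronoun["gender"] and number == pronoun["number"]:
                return mention
        return None  # no compatible antecedent found

    antecedents = [("Petr", "masc", "sg"), ("Jana", "fem", "sg"), ("the books", "neut", "pl")]
    print(resolve_pronoun({"gender": "fem", "number": "sg"}, antecedents))  # -> 'Jana'

Real systems add syntactic constraints, salience ranking and machine-learned scoring on top of such agreement filters.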
9. 10. 2008
Jan Vlach: Gaussian Quantum Marginal Problem
Abstract:
If we have a composite system in a given quantum state, we can ask whether the spectra of some of its reduced states are compatible with the spectrum of the state of the whole system. This question is very important because it is common to investigate large quantum systems by local unitary operations. The Gaussian version of the quantum marginal problem deals with Gaussian states, which are fully described by the first and second moments of their canonical coordinates. For 3 modes this problem is solved, but I will try to show that it is possible to characterize these reductions generally.
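For orientation only (this is not part of the abstract, and normalization conventions vary), the symplectic spectrum in question can be recalled via Williamson's theorem:

    \[
      S\,\sigma\,S^{T} \;=\; \bigoplus_{j=1}^{n} \nu_j\, I_2,
      \qquad S \in \mathrm{Sp}(2n,\mathbb{R}),
    \]
    where $\sigma$ is the $2n \times 2n$ covariance matrix of an $n$-mode Gaussian state and
    $\nu_1,\dots,\nu_n$ (each bounded below by the vacuum value, $1$ or $1/2$ depending on
    convention) form its symplectic spectrum. The Gaussian marginal problem then asks which
    symplectic spectra of the reduced states are compatible with a given global spectrum.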
16. 10. 2008
Tomáš Hnilica: Visualization of large FEM models
Abstract:
Today, numerical simulation is an indispensable part of the computer aided product development chain and FEM simulations are widely used. The amount of data produced by a finite element calculation poses a particular challenge to scientific visualization. Several optimization techniques and visualization approaches are known. The aim of my work is to develop a fast visualization engine for very large FEM models and simulation results that will be processed on commodity hardware. In this presentation, an overview of the FEM model structure and simulation results will be given. A survey of known optimization approaches and their feasibility for FEM will be discussed.
23. 10. 2008
Jan Pomikálek: Even larger web corpora
Abstract:
Text corpora are a valuable resource for many fields of computational linguistics. As a result of Zipf's law, many events in natural languages occur rarely and we often do not have enough data to be able to study them. Even though there is an enormous amount of text available on the web for some languages, the sizes of web corpora created to date have not exceeded 3 billion words. This presentation will describe a step-by-step procedure for creating a web corpus of English texts with a target size of 20 billion words. Related problems will be described, with a main focus on detecting near-duplicate documents in billion-word text collections, and an original efficient solution to the problem will be presented.
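As an illustration of the near-duplicate problem (a textbook baseline, not the original solution announced above), documents can be compared by the Jaccard similarity of their word n-gram shingles; at billion-word scale this exact pairwise comparison is infeasible and is typically replaced by hashing-based sketches such as minhash.

    def shingles(text, n=5):
        """Set of word n-grams of a document."""
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

    def jaccard(a, b):
        return len(a & b) / len(a | b) if a | b else 0.0

    def near_duplicates(docs, threshold=0.8, n=5):
        """Return index pairs of documents whose shingle sets overlap heavily."""
        sh = [shingles(d, n) for d in docs]
        return [(i, j) for i in range(len(docs)) for j in range(i + 1, len(docs))
                if jaccard(sh[i], sh[j]) >= threshold]

    docs = ["the quick brown fox jumps over the lazy dog near the river bank",
            "the quick brown fox jumps over the lazy dog near the river edge",
            "an entirely different sentence about web corpora and text statistics"]
    print(near_duplicates(docs, threshold=0.5))  # -> [(0, 1)]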
30. 10. 2008
Jiří Filipovič: General Purpose Computing on Graphics Hardware
Abstract:
The current fastest GPUs outperform today's CPUs by about an order of magnitude in floating point arithmetic and memory bandwidth. Moreover, CUDA (Compute Unified Device Architecture) enables developers to write algorithms for GPUs directly in a C-like language, instead of reformulating them as graphics problems. These two facts make GPUs more attractive for general purpose computing. In this talk, the architecture and programmability of GPUs based on the nVidia G80 will be presented. Examples of current successful GPU implementations outperforming CPU ones will be introduced, and basic optimization problems specific to GPUs will be discussed. I will conclude with an outline of my research in this area.
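As a taste of the data-parallel programming model (a minimal sketch assuming the numba package and an NVIDIA CUDA-capable GPU; the talk itself concerns CUDA's C-like programming of G80-class hardware), the following kernel adds two vectors with one thread per element:

    import numpy as np
    from numba import cuda

    @cuda.jit
    def vec_add(a, b, out):
        i = cuda.grid(1)          # absolute index of this thread
        if i < out.size:          # guard against the last, partially filled block
            out[i] = a[i] + b[i]

    n = 1_000_000
    a = np.random.rand(n).astype(np.float32)
    b = np.random.rand(n).astype(np.float32)
    out = np.zeros_like(a)

    threads_per_block = 256
    blocks = (n + threads_per_block - 1) // threads_per_block
    vec_add[blocks, threads_per_block](a, b, out)  # numba copies the arrays to and from the GPU
    assert np.allclose(out, a + b)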
6. 11. 2008
Ondřej Daněk: Graph Cut Based Image Segmentation
Abstract:
Image segmentation is one of the fundamental tasks in image analysis, for which many methods have been proposed. In recent years, well-founded methods based on combinatorial graph cut algorithms have emerged and have been successfully applied to a wide range of problems in vision and graphics. In this talk, the basic idea behind graph cut based image segmentation will be presented together with its main advantages and disadvantages, and some preliminary results demonstrating the application of graph cut based methods to the fully automatic segmentation of cell nuclei clusters will be shown and discussed.
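The construction behind such methods can be sketched on a toy example (assuming numpy and networkx; the parameters are made up and production code uses dedicated max-flow solvers): pixels become graph nodes, terminal edges encode how well a pixel fits the foreground/background model, neighbour edges encode smoothness, and a minimum s-t cut yields the labelling.

    import numpy as np
    import networkx as nx

    def graph_cut_segment(img, lam=0.1):
        """Binary segmentation of a [0,1] grayscale image via a minimum s-t cut."""
        h, w = img.shape
        G, S, T = nx.DiGraph(), "src", "sink"
        for y in range(h):
            for x in range(w):
                p = (y, x)
                G.add_edge(S, p, capacity=float(img[y, x]))        # paid if p is labelled background
                G.add_edge(p, T, capacity=float(1.0 - img[y, x]))  # paid if p is labelled foreground
                for q in [(y, x + 1), (y + 1, x)]:                 # 4-connected smoothness terms
                    if q[0] < h and q[1] < w:
                        G.add_edge(p, q, capacity=lam)
                        G.add_edge(q, p, capacity=lam)
        _, (src_side, _) = nx.minimum_cut(G, S, T)
        labels = np.zeros((h, w), dtype=bool)
        for node in src_side - {S}:
            labels[node] = True                                    # source side = foreground
        return labels

    img = np.zeros((6, 6)); img[2:5, 2:5] = 0.9   # bright square on a dark background
    print(graph_cut_segment(img).astype(int))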
13. 11. 2008
Tomáš Čapek: State-of-the-art of semantic networks and lexicons
Abstract:
TBA
20. 11. 2008
Vojtěch Krmíček: High-Speed Network Traffic Acquisition and Preprocessing
Abstract:
The talk will present the design of a high-speed network traffic acquisition subsystem suitable for agent-based intrusion detection systems. To match the performance requirements and to improve network traffic measurement, the wire-speed data acquisition layer is based on hardware-accelerated probes, which provide real-time network traffic statistics. The network traffic is stored on collector servers and the preprocessed data is then sent to detection agents that use heterogeneous anomaly detection methods. The presented system is designed to improve the performance of agent-based intrusion detection systems and allow them to efficiently identify malicious traffic. Its main contribution is the ability to aggregate real-time network-wide statistics from geographically dispersed probes. The traffic acquisition system is designed for deployment on high-speed backbone networks.
27. 11. 2008
Zdeněk Vrbka: Why are traditional testing approaches not sufficient in service systems?
Abstract:
At present, the world economy is shifting from a product paradigm to a service paradigm. Service systems deal with complex problems that cannot be solved using the product approach. Service systems involve people, organizations, information and technology, and their failure can have large, often catastrophic consequences. Therefore it is important to test these systems in a proper way. The presentation illustrates the reasons why traditional testing approaches are not sufficient in service systems. It also draws attention to the necessity of studying the problem of service systems testing.
4. 12. 2008
Martin Maška: A Two-Phase Cell Nucleus Segmentation Using Topology Preserving Level Set Method
Abstract:
An accurate localization of the cell nucleus boundary is essential for any further quantitative analysis of proteins, genes, chromosomes and other subnuclear structures inside the cell nucleus. In this talk, we present a novel approach to cell nucleus segmentation in fluorescence microscope images exploiting the level set framework. The proposed method works in two phases. In the first phase, the image foreground is separated from the background to obtain a binary mask of individual cell nuclei as well as their clusters. The second phase is focused on the boundary detection of each cell nucleus within the previously identified clusters.
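The two-phase idea can be mimicked in a few lines (a sketch only, assuming scipy and scikit-image; it substitutes a generic morphological Chan-Vese evolution for the topology preserving level set method of the talk, and a synthetic image stands in for real fluorescence data):

    import numpy as np
    from scipy import ndimage as ndi
    from skimage.filters import threshold_otsu, gaussian
    from skimage.segmentation import morphological_chan_vese

    # Synthetic "nuclei": two bright blobs on a dark, noisy background.
    yy, xx = np.mgrid[0:128, 0:128]
    img = np.exp(-(((yy - 45) ** 2 + (xx - 50) ** 2) / 300.0)) \
        + np.exp(-(((yy - 80) ** 2 + (xx - 75) ** 2) / 300.0))
    img = gaussian(img, 1) + 0.05 * np.random.default_rng(0).normal(size=img.shape)

    # Phase 1: separate foreground from background (binary mask of nuclei and clusters).
    mask = img > threshold_otsu(img)
    labels, n_objects = ndi.label(mask)

    # Phase 2: refine object boundaries with a level-set-like evolution
    # initialised from the phase-1 mask.
    refined = morphological_chan_vese(img, 50, init_level_set=mask, smoothing=2)
    print(n_objects, refined.sum())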
11. 12. 2008
Jan Sedmidubský: A Self-organized System for Content-based Search in Multimedia
Abstract:
We propose a self-organized system for content-based search in multimedia data. In particular, we build a semantic overlay over an existing peer-to-peer network. The self-organization of the overlay is obtained by using the social-network paradigm. The connections between peers are formed on the basis of a query-answer principle. The knowledge about answers to previous queries is exploited to route queries efficiently. At the same time, a randomized mechanism is used to explore new and unvisited parts of the network. In this way, a self-adaptable and robust system is built. Moreover, the metric-space data model is used to achieve extensibility. The proposed concepts are verified on a network consisting of 2,000 peers and indexing 10 million images.
18. 12. 2008
Poster Session

Spring 2009

19. 2. 2009
Opening seminar of the spring semester
Information on the concept of the seminar in the spring semester. Agreement on the seminar programme. Discussion.
26. 2. 2009
Nikola Beneš: Partial Order Reduction for State/Event LTL
Abstract:
Software systems assembled from a large number of autonomous components have become an interesting target for formal verification due to the issue of correct interplay in component interaction. State/event LTL incorporates both states and events to express important properties of component-based software systems. The main contribution of this work is a partial order reduction technique for the verification of state/event LTL properties. The core of the partial order reduction is a novel notion of stuttering equivalence which we call state/event stuttering equivalence. A positive attribute of this equivalence is that it can be handled with existing methods for partial order reduction. State/event LTL properties are, in general, not preserved under state/event stuttering equivalence. We therefore define a new logic, called weak state/event LTL, which is invariant under the new equivalence.
5. 3. 2009
Jan Vykopal: Network-based Dictionary Attack Detection
Abstract:
This paper describes a novel network-based approach to dictionary attack detection with the ability to recognize a successful attack. We analyzed SSH break-in attempts at the flow level and determined a dictionary attack pattern. This pattern was verified and compared to common SSH traffic to prevent false positives. The SSH dictionary attack pattern was implemented using the decision tree technique. The evaluation was performed in a large high-speed university network with promising results.
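For illustration only (the features, numbers and data below are made up and are not the pattern from the paper), a decision tree over simple per-flow statistics can be trained with scikit-learn like this:

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(1)
    # Hypothetical per-flow features: [packets, bytes, duration_s, flows_per_minute_from_src]
    benign = np.column_stack([rng.integers(20, 200, 200), rng.integers(3000, 50000, 200),
                              rng.uniform(5, 300, 200),   rng.integers(1, 5, 200)])
    attack = np.column_stack([rng.integers(10, 30, 200),  rng.integers(1000, 5000, 200),
                              rng.uniform(0.5, 5, 200),   rng.integers(30, 200, 200)])
    X = np.vstack([benign, attack])
    y = np.array([0] * 200 + [1] * 200)  # 0 = benign, 1 = dictionary attack

    clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
    print(clf.predict([[15, 2000, 1.2, 80]]))  # a short, fast, repetitive flow -> likely attack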
12. 3. 2009
Jan Vlach: Quantum marginal problem
Abstract:
A very interesting question is which multipartite Gaussian states can be prepared. This presentation deals with the relations between the symplectic spectra of n-mode quantum systems and the symplectic spectra of their single-mode subsystems. It shows results for two cases, when the subsystems are in a pure or in a mixed quantum state. The talk provides a brief review of the results achieved so far and presents new methods and ideas used in the search for appropriate conditions valid for more general reductions to subsystems consisting of k modes.
19. 3. 2009
Jiří Materna: Czech Verbs in FrameNet Semantics
Abstract:
In the natural language processing field, there is a trend to build large electronic lexical databases based on semantic information. These resources are widely used in several applications such as information retrieval, machine translation and even disambiguation tasks on all levels. This work presents a method of automatically connecting verbs and their valencies in the VerbaLex database to entries in the Berkeley FrameNet. While completely manual work can take a long time, this automatic approach requires only a small amount of human effort to reach sufficient results. By linking VerbaLex to FrameNet, we are able to find a nontrivial subset of interlingual FrameNet frames (including their frame-to-frame relations), which could be used as a basis for building a FrameNet for Czech.
26. 3. 2009
Milan Češka: Local Quantitative LTL Model Checking
Abstract:
Quantitative analysis of probabilistic systems has been studied mainly from the global model checking point of view. In global model checking, the goal of verification is to decide the probability of satisfaction of a given property for all reachable states in the state space of the system under investigation. On the other hand, in the local model checking approach the probability of satisfaction is computed only for the set of initial states. In theory, it is possible to solve the local model checking problem using the global model checking approach. However, the global model checking procedure can be significantly outperformed by a dedicated local model checking one. In this paper we present several particular local model checking techniques that, compared to the global model checking procedure, reduce the runtime needed from days to minutes.
2. 4. 2009
Vojtěch Kovář: Automatic Processing of Czech Syntax
Abstract:
The goal of natural language syntactic analysis is to reveal the surface structure of the input text, or the input sentence, respectively. It can be viewed as a "cornerstone" of many complex natural language processing tasks, ranging from intelligent searching in text to question answering systems and complex information analysis of the input text. In the presentation, we give an overview of the current approaches to the automatic syntactic analysis of the Czech language. We will describe formalisms used for encoding syntactic information, available annotated data and measuring techniques, as well as selected parsers and parsing algorithms. The most notable problems in this field will also be discussed and their possible solutions outlined.
9. 4. 2009
Roman Žilka: Electronic Micropayment Schemes
Abstract:
When you're paying your health insurance, a conference entrance fee or sending money to your employees, it's perfectly reasonable to use ordinary bank transfers. These payments are infrequent enough and large enough to justify the complex transfer operation. However, it would be overkill to use classic transfers to purchase separate web articles worth 1 CZK each as you browse an e-zine site, or separate video clips as you browse a vlogging site. These frequent and tiny payments need special handling. Although "micro-payments", as the self-explanatory term goes, are not exactly widespread around the Net these days, some ideas for the future, such as on-demand computing, call for a versatile and simple scheme that supports them. This talk will review two such schemes.
16. 4. 2009
Ziad Salem: Current Research Projects at the Electrical and Electronic Engineering Faculty, Computer Engineering Department, Aleppo University
Abstract:
The talk will start by giving some information about Syria, then Aleppo University, and then the master course we run at the Computer Engineering Department, Electrical and Electronic Engineering Faculty. Then I will give a brief description of the project which I supervise there.
23. 4. 2009
Vojtěch Krmíček: NetFlow Based Monitoring in the FEDERICA Project
Abstract:
An important part of both current and future high-speed networks is reliable and detailed traffic monitoring. Traffic monitoring is especially important in research networks like FEDERICA. These networks consist of a physical layer and virtual layers above it. Researchers need detailed statistics about traffic in the virtual layers of virtual networks, which are complicated to obtain with standard monitoring tools. This presentation describes a concept of virtual network monitoring based on extended NetFlow. The reconstruction of the virtual network topology as well as the use of the fine-grained flows method will be presented.
30. 4. 2009
J. Plhák: Dialogue-Based Processing of Web Presentation
Abstract:
In this presentation, we describe the basic methods and technologies used in the BrowserWebGen system prototype, which allows blind users to develop their own web presentations using a dialogue. This approach benefits especially from cooperation with web browsers and screen reader software. The basic principles of the BWG system are discussed, as well as a comparison of this system with the VoiceXML solution. As an illustration, we provide an example showing how a blind user can create a web page.
7. 5. 2009
Martin Šmérek: I/O-efficient Binary Decision Diagram Manipulation
Abstract:
Model checking is a popular approach to the formal verification of reactive systems. However, the usage of this method is limited by the so-called state space explosion. One way to cope with this problem is to represent the model and the state space symbolically using Binary Decision Diagrams (BDDs). Unfortunately, during the computation the BDDs can become too large to fit into the available main memory, and it becomes essential to minimize the number of I/O operations. We present an extension of existing algorithms for BDD manipulation and propose a new I/O-efficient algorithm for computing the existential quantification.
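For readers unfamiliar with BDDs, here is a toy, purely in-memory sketch (it ignores the I/O-efficiency issues that are the point of the talk) of the standard identity used for existential quantification, exists x. f = f[x=0] OR f[x=1]:

    from functools import lru_cache

    def mk(var, low, high):
        """Return a reduced node: skip the test when both branches are equal."""
        return low if low == high else (var, low, high)

    @lru_cache(maxsize=None)
    def apply_or(f, g):
        """Disjunction of two ordered BDDs; terminals are True/False."""
        if f is True or g is True:
            return True
        if f is False:
            return g
        if g is False:
            return f
        v = min(f[0], g[0])
        fl, fh = (f[1], f[2]) if f[0] == v else (f, f)
        gl, gh = (g[1], g[2]) if g[0] == v else (g, g)
        return mk(v, apply_or(fl, gl), apply_or(fh, gh))

    @lru_cache(maxsize=None)
    def restrict(f, var, val):
        """Fix variable `var` to `val` in f."""
        if isinstance(f, bool) or f[0] > var:
            return f
        if f[0] == var:
            return f[2] if val else f[1]
        return mk(f[0], restrict(f[1], var, val), restrict(f[2], var, val))

    def exists(f, var):
        return apply_or(restrict(f, var, False), restrict(f, var, True))

    x1 = (1, False, True)   # the function x1
    f = (0, False, x1)      # the function x0 AND x1
    print(exists(f, 0))     # -> (1, False, True), i.e. x1
    print(exists(f, 1))     # -> (0, False, True), i.e. x0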
7. 5. 2009
Ivan Fialík: Cryptographic Applications of Pseudo-Telepathy Games
Abstract:
Communication complexity is an area of classical computer science which studies how much communication is necessary to solve various distributed computational problems. Quantum information processing can be used to reduce the amount of communication required to solve some distributed problems. We speak of pseudo-telepathy when quantum information processing is able to completely eliminate the need for communication. After introducing a general model for pseudo-telepathy games and a few necessary cryptographic definitions, we describe a simple user identification protocol based on the parties playing a pseudo-telepathy game.