
Machine-translated page provided to make the content accessible to English-speaking readers.

Seminar program for 2006/2007

Autumn 2006

22. 9. 2006
Introductory Seminar
Program:
Information on the seminar concept in the autumn semester.
Agenda of the seminar.
Discussion.
5. 10. 2006
J. Pomikalek: WebBootCaT: a web tool for instant corpora
Abstract:
We present a web service for quickly producing corpora for specialist areas, in any of a range of languages, from the web. The underlying BootCaT tools have already been extensively used; here we present a version that is easy for non-technical people to use, as all they need to do is fill in a web form. The corpus, once produced, can either be downloaded or loaded into the Sketch Engine, a corpus query tool, for further exploration. Reference corpora are used to identify the key terms of the specialist domain.
12. 10. 2006
P. Hanks: The English Language: an International Medium of Communication
Abstract:
What is the most widely spoken language in the world? Not English, but Chinese. However, English has a claim to be the world's most important language for the purposes of international communication. Why? My talk begins with a comparison of the world's major languages today - Chinese, Spanish, Arabic, Hindi/Urdu, French, German, etc. - and their different roles in the world community. I then go on to explain the origins, history, and development of English. How did a minor West Germanic dialect, spoken on a rain-slick offshore island, with a comparatively short history, manage to become a world language? What did Old English sound like, and what were the factors that changed it into modern English? What is the basic sound pattern of English that every speaker of English - especially foreign speakers - ought to know about? How is the vocabulary of English changing, and how does English influence other languages? What does this tell us about the world we live in? My argument is that the English language has long since ceased to be the property of any one nation or social group. English is an inexhaustible, robust resource. As such, it must be clearly separated from the history of colonization or the political hegemony of the world's superpower. English is a resource for all members of the world community, and standards of correct English are to be sought and defined as international standards of communicative clarity, not in the preferences or history of any nation or dialect.
19. 10. 2006
B. Zimmer: A formal specification language for modeling component interactions
Abstract:
Current software engineering calls for cost-effective development techniques that guarantee dependable and flexible software products. One of these techniques is component-based development, which is based on composing software from autonomous components. In this context, a new verification issue arises. It concerns the correctness of interaction between components, which are usually delivered by different developers. My PhD topic focuses on this issue. In the talk, I will introduce a formal specification language that we have designed for modeling interactions, and I will briefly discuss concrete problems that I aim to address in my dissertation, including my preliminary results.
26. 10. 2006
J. Šprojcar: Elections, Modularity, and Anonymous Channels
Abstract:
We concentrate on electronic elections as a cryptographic task. In particular, we are interested in its properties and in protocols for its implementation. We argue that this task is very complicated and suggest that a modular approach can be used to simplify it. The modular approach consists of taking small protocols and building a larger one on top of them. The question is whether the election task can be decomposed into small (meaningful) primitives. We present one of these primitives - the anonymous channel - and show a way of using this primitive in designing election schemes.
2. 11. 2006
P. Moravec: Distributed State Space Reductions
Abstract:
Model checking has become a popular approach used for formal verification of concurrent systems. However, the so-called state space explosion limits the use of the approach. Several methods dealing with state space explosion have been proposed. Since these methods are often based on orthogonal ideas, there is a natural question of how to combine them. We present combinations of two such approaches: distributed verification and reductions of state spaces (symmetry reduction and tau-confluence reduction).
9. 11. 2006
P. Lidman: Using Machine Learning in Human Risk Assessment
Abstract:
Human risk assessment is a rapidly developing biomedical field that aims to efficiently repair damage to health and the environment - and, where possible, prevent it. Modern diagnostic tools and approaches generate a great deal of useful data that are difficult or even impossible to analyze by means of statistics, let alone manually using expertise alone. This talk introduces two such datasets: the National Cancer Registry and microarrays, a technique for acquiring gene expression data. This will be followed by a discussion of the prediction and knowledge discovery potential of machine learning methods on such datasets, describing the challenges ahead.
16. 11. 2006
M. Hejč: Measurement and Improvement of Data Quality
Abstract:
The use of data is one of the typical tasks of modern software. Data originate from various sources and are burdened by various types of "noise". I will present the current state of the terminology and methodology for dealing with these shortcomings. Since there is no common approach, I will identify the direction of my future research in this area - to find a common methodology. I will also present the results of some case studies.
23. 11. 2006
H. Mlnařík: LanQ - a Quantum Programming Language
Abstract:
The interest in quantum information processing was stimulated by the development of quantum algorithms and protocols that are capable of solving certain problems faster or more securely than their classical counterparts. This led to the creation of quantum programming languages and quantum process algebras. Combining features of these programming languages and process algebras led to the development of a new programming language - LanQ. LanQ is an imperative quantum programming language that offers tools for new process creation and interprocess communication. It allows the programmer to work with both classical and quantum data. In the talk, we introduce the language.
30. 11. 2006
T. Masopust: Self-Regulating Finite Automata
Abstract:
The presentation introduces and discusses self-regulating finite automata, defined as finite automata that regulate the use of their rules according to the sequence of rules applied during previous moves. Special attention is paid to turns, defined as moves during which a self-regulating finite automaton starts a new self-regulating sequence of moves. Based on the number of turns, two infinite hierarchies of language families resulting from two variants of these automata are established.
7. 12. 2006
M. Kasik: Deconvolution of images acquired in Fluorescence Microscopy with space-variant Point Spread Function
Abstract:
Images acquired by fluorescence microscopy are blurred by the point-spread function. Before processing such data, we need to remove the blur. The process that performs this sharpening of the image is called deconvolution. To run this class of algorithms, we need an appropriate point-spread function (PSF). In the talk we will present the results of applying deconvolution with an incorrect PSF. We will also introduce a solution to this problem.
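A minimal sketch of one classical deconvolution scheme, Richardson-Lucy iteration with a spatially invariant PSF, may help make the above concrete; the space-variant case addressed in the talk requires considerably more machinery, and the Gaussian PSF and test image below are made-up illustrations, not data from the talk.

import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(observed, psf, iterations=30):
    """Classical Richardson-Lucy deconvolution for a space-invariant PSF.
    `observed` is the blurred image, `psf` the point-spread function."""
    psf_mirror = psf[::-1, ::-1]              # PSF flipped in both axes
    estimate = np.full_like(observed, 0.5)    # flat initial guess
    for _ in range(iterations):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = observed / (blurred + 1e-12)  # avoid division by zero
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate

# Toy example: a single bright point blurred by a hypothetical Gaussian PSF.
x = np.arange(-3, 4)
g = np.exp(-x**2 / 2.0)
psf = np.outer(g, g)
psf /= psf.sum()

true_image = np.zeros((64, 64))
true_image[32, 32] = 1.0
blurred = fftconvolve(true_image, psf, mode="same")
restored = richardson_lucy(blurred, psf, iterations=50)
print(np.unravel_index(restored.argmax(), restored.shape))   # ~ (32, 32)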
14. 12. 2006
M. Grac: Machine Translation of Closely Related Languages
Abstract:
Machine translation is a part of computational linguistics that investigates the use of software to translate text or speech from one language to another. It is one of the most complex tasks in natural language processing. Our research focuses on translation between closely related languages, namely Czech and Slovak. In the presentation we will cover the current state of the art.

Spring 2007

22. 2. 2007
Introductory Seminar of the Spring Semester
Information on the seminar concept in the spring semester. Agenda of the seminar. Discussion.
1. 3. 2007
D. Klusacek: Grid Scheduling Simulator
Abstract:
Effective job scheduling in the context of Grid computing poses a complex problem that is often solved by simplified techniques. This presentation focuses on the design of a system for studying advanced scheduling techniques for various types of jobs in the Grid environment. The solution is able to deal with common problems of job scheduling in Grids, such as the heterogeneity of jobs and resources, and with dynamic runtime changes such as the arrival of new jobs. The GridSim simulation toolkit was extended to provide a simulation environment that supports simulation of various Grid scheduling problems. We have implemented an experimental centralized Grid scheduler that uses local search based algorithms and dispatching rules for schedule generation. Interesting experimental results comparing the quality of optimization and time performance will be presented.
8. 3. 2007
P. Drasil: Achieving reusability in e-learning
Abstract:
Reusability is one of the most desirable features of modern teaching/learning approaches, especially e-learning. Historically, this issue was addressed by the standardization of metadata and packaging formats for electronic learning materials. However, it has been shown that this is not enough and that the pedagogy behind the material is a very important aspect of every learning material. Instructional design techniques that allow teachers to describe and consequently reuse whole teaching/learning scenarios have been developed and are currently on their way into common practice. The talk will provide an overview of instructional design techniques and their possible benefits.
15. 3. 2007
P. Beneš: Computation of Tunnels in Protein Molecules based on Computational Geometry
Abstract:
Long-term research into the biochemical characteristics of protein molecules has led to the discovery that protein reactivity is closely related to the presence of tunnels leading from the protein surface to a biochemically relevant cavity within the protein molecule. Our research focuses on the computation of these tunnels in static protein molecules and on the analysis of the behavior of tunnels in sequences of molecule snapshots in time. The methods we propose are based on computational geometry - the Voronoi diagram and the Delaunay triangulation in particular.
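As a pointer to the geometric structures named above (not the tunnel-search algorithm itself), here is a minimal sketch of building the Delaunay triangulation and the dual Voronoi diagram of a few hypothetical atom centres with SciPy; tunnel-detection methods of this family typically search for a cheapest path through such a diagram from an inner cavity to the surface.

import numpy as np
from scipy.spatial import Delaunay, Voronoi

# Hypothetical atom centres of a (tiny) molecule snapshot, in angstroms.
atoms = np.array([[0.0, 0.0, 0.0],
                  [1.5, 0.0, 0.2],
                  [0.7, 1.3, 0.1],
                  [0.8, 0.5, 1.4],
                  [2.1, 1.1, 0.9],
                  [1.2, 2.0, 1.8]])

tri = Delaunay(atoms)        # tetrahedralisation of the atom centres
vor = Voronoi(atoms)         # dual Voronoi diagram

print(tri.simplices)         # tetrahedra as quadruples of atom indices
print(vor.vertices)          # Voronoi vertices: candidate points of tunnel axes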
22. 3. 2007
B. Kozlíková: Visualization of Protein Molecules and Tunnels in Three-Dimensional Space
Abstract:
Proteins are among the most important molecules in living organisms, so the analysis of these molecules is crucial in the process of designing new drugs. The next important step after the analysis is the visualization of the results, which can provide a much more intuitive and deeper view of the complex structure. We are trying to enable the user to explore the structure of the molecule and its tunnels using various techniques that suppress unimportant parts of the molecule and stress the important ones. The molecule is not a static system. It is influenced by its surroundings, so we can observe the behavior of the molecule over time. Our research in this area focuses on devising techniques that compute and show only the substantial movements of atoms. In the field of tunnel visualization we have to come up with novel approaches that will allow us to explore the path from the outside of the molecule to its active site, where the chemical reactions proceed.
29. 3. 2007
P. Šimeček: LTL Model Checking with I/O-Efficient Accepting Cycle Detection
Abstract:
We show how to adapt the existing non-DFS-based OWCTY algorithm for accepting cycle detection to the I/O-efficient setting, and we compare the I/O efficiency and practical performance of the adapted algorithm to the existing I/O-efficient LTL model checking approach of Edelkamp et al. We show that while the new algorithm exhibits similar I/O complexity with respect to the size of the graph, it avoids the quadratic increase in the size of the graph incurred by the approach of Edelkamp et al. Therefore, the absolute numbers of I/O operations are significantly smaller and the algorithm exhibits better practical performance.
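For readers unfamiliar with OWCTY (One Way Catch Them Young), the following is a minimal in-memory sketch of the underlying fixpoint scheme, assuming the whole graph fits in RAM; the talk's contribution, the adaptation of this scheme to an I/O-efficient external-memory setting, is not reflected here, and the tiny example graph is made up.

from collections import deque

def bfs(sources, allowed, succ):
    """States of `allowed` reachable from `sources` (staying within `allowed`)."""
    seen = {s for s in sources if s in allowed}
    work = deque(seen)
    while work:
        u = work.popleft()
        for v in succ[u]:
            if v in allowed and v not in seen:
                seen.add(v)
                work.append(v)
    return seen

def eliminate(allowed, succ):
    """Repeatedly drop states with no predecessor inside `allowed`
    (such states cannot lie on a cycle)."""
    indeg = {u: 0 for u in allowed}
    for u in allowed:
        for v in succ[u]:
            if v in allowed:
                indeg[v] += 1
    work = deque(u for u in allowed if indeg[u] == 0)
    alive = set(allowed)
    while work:
        u = work.popleft()
        alive.discard(u)
        for v in succ[u]:
            if v in alive:
                indeg[v] -= 1
                if indeg[v] == 0:
                    work.append(v)
    return alive

def owcty(succ, init, accepting):
    """True iff a cycle through an accepting state is reachable from `init`."""
    s = bfs({init}, set(succ), succ)        # all reachable states
    old = None
    while s != old:
        old = s
        s = bfs(accepting & s, s, succ)     # keep states reachable from accepting ones
        s = eliminate(s, succ)              # drop states that cannot be on a cycle
    return bool(s)

# Tiny Buechi-like graph: 0 -> 1 -> 2 -> 1, state 2 accepting => accepting cycle.
succ = {0: [1], 1: [2], 2: [1]}
print(owcty(succ, 0, {2}))                  # True
print(owcty(succ, 0, {0}))                  # False (state 0 lies on no cycle)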
5. 4. 2007
O. Oladimeji: Survey of Digital Systems Testing and Simulation Techniques
Abstract:
Digital systems have brought a major revolution to virtually all aspects of human endeavor. Digital electronic circuits are shrinking in physical size while their capabilities and speed of operation increase. To date, the miniaturized nature of digital circuits has placed computing power in the hands of many users, with applications ranging from home entertainment systems to hand-held systems (not to mention embedded systems) to large computer systems. However, dependence on these systems calls for the production of systems which must be highly reliable. One major problem which rocks the digital system revolution is the problem of testing and simulation. The increase in the size and complexity of circuits placed on a chip, with little or no increase in the number of input and output pins, effectively creates a bottleneck. Tests must not only detect failures in individual units but also failures caused by a defective manufacturing process. In this presentation, we examine the traditional approach for generating stimuli (input vectors) and their application to combinational and sequential circuits. Various test algorithms such as the D-Algorithm, Path Sensitization, the Boolean Difference Algorithm, and Automatic Test Pattern Generation (ATPG) algorithms are also examined. The overall evaluation reveals the associated problems, and feasible solutions are presented.
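As a small worked illustration of one of the techniques named above, the Boolean difference: for a combinational function f and input x, dF/dx = f with x fixed to 0 XOR f with x fixed to 1, and a test vector for a stuck-at-0 fault on x is any assignment with x = 1 where the Boolean difference evaluates to 1 (so the fault effect propagates to the output). The example circuit below is hypothetical.

from itertools import product

def f(a, b, c):
    # Example circuit (hypothetical, for illustration): f = (a AND b) OR c
    return (a & b) | c

def boolean_difference(f, i, args):
    # dF/dx_i = f(..., x_i = 0, ...) XOR f(..., x_i = 1, ...)
    lo = list(args); lo[i] = 0
    hi = list(args); hi[i] = 1
    return f(*lo) ^ f(*hi)

# Tests for a stuck-at-0 fault on input a (index 0): assignments with a = 1
# where the Boolean difference with respect to a equals 1.
tests = [bits for bits in product((0, 1), repeat=3)
         if bits[0] == 1 and boolean_difference(f, 0, bits)]
print(tests)   # -> [(1, 1, 0)]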
12. 4. 2007
J. Chaloupka: New Distributed Algorithm for Decomposition of Graphs into SCCs
Abstract:
Decomposing a directed graph into its strongly connected components (SCCs) is one of the basic graph problems. It has many applications, among others in the analysis of computer systems. It can be solved in linear time. However, graphs of complex computer systems tend to be very large, making it hard to handle them on a single machine. One way to solve this problem is to distribute the graph across a cluster of workstations. Unfortunately, the linear sequential algorithm is unusable in such a setting. Several distributed algorithms for SCC decomposition have been proposed. We present a new distributed algorithm and show its performance in our experiments.
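To make the distribution-friendly idea concrete: one well-known family of distributed SCC algorithms replaces depth-first search, which parallelizes poorly, by reachability computations. Below is a minimal sequential sketch of that forward-backward scheme; it illustrates the general idea only and is not the new algorithm presented in the talk, and the toy graph is made up.

def reachable(sources, edges, subset):
    """Vertices of `subset` reachable from `sources` along `edges`."""
    seen = {s for s in sources if s in subset}
    stack = list(seen)
    while stack:
        u = stack.pop()
        for v in edges.get(u, ()):
            if v in subset and v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

def fb_scc(vertices, edges, redges):
    """Forward-backward SCC decomposition.
    `edges` maps a vertex to its successors, `redges` to its predecessors."""
    if not vertices:
        return []
    pivot = next(iter(vertices))
    forward = reachable({pivot}, edges, vertices)
    backward = reachable({pivot}, redges, vertices)
    scc = forward & backward                   # the SCC of the pivot
    rest = vertices - forward - backward
    return ([scc]
            + fb_scc(forward - scc, edges, redges)
            + fb_scc(backward - scc, edges, redges)
            + fb_scc(rest, edges, redges))

# Toy graph: {1, 2, 3} form one SCC, 4 and 5 are singleton SCCs.
edges  = {1: [2], 2: [3], 3: [1, 4], 4: [5], 5: []}
redges = {1: [3], 2: [1], 3: [2], 4: [3], 5: [4]}
print(fb_scc({1, 2, 3, 4, 5}, edges, redges))
# e.g. [{1, 2, 3}, {4}, {5}] (order depends on pivot choice)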
P. Klika: Software assistant supporting medical processes
Abstract:
This work was motivated by issues in current IT systems in medicine, which include, in particular, the lack of adequate data caused by insufficient integration of their sources; the vast volume of data presented to physicians without any relevance to the given context; and information of doubtful quality, with no indication as to that quality. One of the major challenges in medical systems is support for therapeutic practice guidelines. This topic covers several levels of support - from informing doctors about the existence of appropriate guidelines and their steps, through guiding them through the chosen process, to validating the correctness of the guidelines themselves. Systems supporting practice guidelines must meet a number of prerequisites, which include context modeling, relevance determination and complex similarity searching. In this presentation, these requirements and the principles of possible solutions will be discussed.
19. 4. 2007
I. Peterlík: An Algorithm of State-Space Precomputation Allowing Non-linear Haptic Deformation Modeling
Abstract:
An interesting area of virtual reality research is haptic interaction with deformable objects. The user is equipped with a haptic device with force feedback. Using a probe (a virtual object) during the interaction, the user forces the deformable body to change its shape. Such models are used in the implementation of surgical simulators that allow users to perform virtual operations (surgical training and complex operation planning). Realistic modeling based on a physical formulation is computationally expensive. Normally, large systems of non-linear equations must be solved in each step. On the other hand, realistic haptic behavior requires a high refresh rate (over 1 kHz), and therefore computing the full model in real time is not feasible. In my talk I will describe a new algorithm that allows haptic interaction with computationally demanding models. The algorithm is based on precomputation of the state space and interpolation of the precomputed data during the interaction. Further, I will give some preliminary results regarding the accuracy of the algorithm and I will also describe some modifications allowing more complex operations such as cutting and tearing the tissue.
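A minimal sketch of the general precompute-then-interpolate idea, using a made-up one-dimensional model: an expensive solver is sampled offline on a grid of probe configurations, and the haptic loop only interpolates, which is cheap enough for kilohertz rates. The actual algorithm in the talk handles full non-linear deformation states rather than a single scalar.

import numpy as np

def expensive_deformation_solver(probe_depth):
    """Stand-in for an expensive non-linear solver (hypothetical model):
    returns the reaction force for a given probe indentation depth."""
    return 50.0 * np.tanh(2.0 * probe_depth)   # made-up non-linear response

# Offline phase: sample the solver on a grid of probe configurations.
grid = np.linspace(0.0, 1.0, 21)                # probe depths in cm
precomputed_force = np.array([expensive_deformation_solver(d) for d in grid])

# Haptic loop (ideally > 1 kHz): only interpolate the precomputed data.
def haptic_force(probe_depth):
    return np.interp(probe_depth, grid, precomputed_force)

print(haptic_force(0.37))   # fast approximation of the expensive solver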
26. 4. 2007
T. Rebok: DiProNN: VM-based Distributed Programmable Network Node Architecture
Abstract:
The active network approach allows an individual user to inject customized programs into active nodes in the network, usually called programmable/active nodes, and thus to process data in the network as it passes through. As the speeds of network links continue to increase, and consequently applications' demands for network bandwidth increase as well, a single active node is unable to process such high-bandwidth user data in real time, since the processing may be fairly complex. In my talk I will present the architecture of the DiProNN node --- the VM-based Distributed Programmable Network Node that improves the scalability of such an active system with respect to the number of active programs running simultaneously on the node and with respect to the bandwidth of each passing stream processed. Since the node is primarily intended to perform stream processing, and to make programming of streaming applications for the DiProNN node easier, I will also present a suitable modular programming model that takes advantage of DiProNN virtualization and makes its programming more comfortable.
J. Šprojcar: Anonymous Channels
Abstract:
The talk will focus on cryptographic primitives called anonymous channels. These are cryptographic protocols that deal with the anonymity of parties, e.g. a channel where the sender of a message is anonymous (within a set of possible senders). We will present several anonymous channels and their applications as examples. A method called dining cryptographers networks, introduced by D. Chaum in 1988, which implements an anonymous broadcast channel, is also briefly presented. We end the talk with one of our contributions - a formalization of anonymous channels.
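As a concrete illustration of the dining cryptographers construction mentioned above: each pair of parties shares a secret random bit, every party announces the XOR of its shared bits (the sender additionally XORs in its message bit), and the XOR of all announcements reveals the message without revealing who sent it. A minimal three-party round follows; the pairwise key setup is simulated locally here, which a real protocol would of course not do.

import secrets

def dc_net_round(n_parties, sender, message_bit):
    """One round of a dining-cryptographers net broadcasting a single bit."""
    # Every pair of parties (i, j) shares one secret random bit.
    shared = {(i, j): secrets.randbelow(2)
              for i in range(n_parties) for j in range(i + 1, n_parties)}

    announcements = []
    for p in range(n_parties):
        # XOR of all bits party p shares with the other parties.
        bit = 0
        for (i, j), key in shared.items():
            if p in (i, j):
                bit ^= key
        if p == sender:
            bit ^= message_bit       # the sender flips its announcement
        announcements.append(bit)

    # XOR of all announcements: the pairwise keys cancel, the message remains.
    result = 0
    for bit in announcements:
        result ^= bit
    return result

print(dc_net_round(3, sender=1, message_bit=1))   # -> 1, sender stays hidden
print(dc_net_round(3, sender=2, message_bit=0))   # -> 0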
3. 5. 2007
V. Nemc: Anaphora Resolution
Abstract:
At present, anaphora resolution is one of the greatest challenges in the field of natural language understanding. Although anaphora plays an important role in human communication and is in its essence an interdisciplinary issue, it is not widely familiar to computer scientists. Therefore, this talk provides a brief insight into the relevant linguistic background, and an overview of various types of anaphora and their computational aspects. Finally, it sketches various research objectives on the way to a badly needed anaphora resolution system for Czech.
L. Boháč: Programmable quantum processors - classification and equivalence
Abstract:
Quantum information processing has recently become a very important and challenging branch of computer science. The laws of the microscopic quantum world can be exploited for speeding up computations or for implementing more secure cryptographic schemes. A programmable quantum processor could be the heart of a quantum computer. Nowadays, experimenters set up a special device for each task they want to perform. Such a device is controlled externally using classical parameters. In contrast, a programmable quantum processor is a fixed device that can be programmed quantumly - it is programmable by the states of a quantum system. There is no universal processor for the basic deterministic model. New types of processors - probabilistic and approximative - have been developed, for which universal processors can be designed. Basic concepts and properties, in particular equivalence, of different types of programmable quantum processors will be presented.
10. 5. 2007
J. Sedmidubsky: Metric Social Networks
Abstract:
The area of similarity searching is a very hot topic for both research and commercial applications. The data being processed today have considerably less structure and much less precision than the data in traditional database systems. Examples are multimedia data like images or videos, which are searched by example and cannot be meaningfully searched by precise database queries. One approach to similarity search emerges from the notion of a social network. A social network refers to a social structure of people linked to each other through a common relationship or interest. Our approach places the peers of a distributed access structure in the role of the people in the social network and creates relationships among them according to the similarity of the peers' data. Query processing then corresponds to searching for a community of peers with similar data.
I. Fialik: Pseudo-Telepathy Games
Abstract:
Quantum information processing lies at the crossroads of physics, mathematics and computer science. It investigates what we can and cannot do with quantum information that goes beyond the capabilities of classical information processing. Communication complexity is an area of classical computer science that studies how much communication is necessary to solve distributed computational problems. Quantum information processing can be used to reduce the amount of communication required to solve some distributed problems. We speak of pseudo-telepathy when it serves to eliminate the need for communication completely. After a brief overview of the principles of quantum mechanics, we introduce the model of pseudo-telepathy games. As an example of a pseudo-telepathy game, we describe the magic square game.
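To make the magic square (Mermin-Peres) game concrete: Alice receives a row index and must output three entries from {+1, -1} whose product is +1, Bob receives a column index and must output three entries whose product is -1, and they win if their answers agree on the shared cell. The brute-force check below runs over all deterministic classical strategies and confirms that none wins on all nine inputs (the best reach 8 of 9), which is exactly the gap that the quantum strategy closes.

from itertools import product

# Rows Alice may output: three entries in {+1, -1} with product +1.
alice_rows = [r for r in product((1, -1), repeat=3) if r[0] * r[1] * r[2] == 1]
# Columns Bob may output: three entries with product -1.
bob_cols = [c for c in product((1, -1), repeat=3) if c[0] * c[1] * c[2] == -1]

best = 0
# A deterministic classical strategy fixes one answer per possible input.
for alice in product(alice_rows, repeat=3):        # Alice's answer for rows 0..2
    for bob in product(bob_cols, repeat=3):        # Bob's answer for columns 0..2
        wins = sum(alice[r][c] == bob[c][r]        # must agree on cell (r, c)
                   for r in range(3) for c in range(3))
        best = max(best, wins)

print(best, "out of 9")   # -> 8 out of 9: no classical strategy always wins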
17. 5. 2007
Poster Session