Seminar programme for 2006/2007
- 22. 9. 2006
- Opening seminar of the autumn semester
Information about the concept of the seminar in the autumn semester.
Agreement on the seminar programme.
- 5. 10. 2006
- J. Pomikalek: WebBootCaT: a web tool for instant corpora
We present a web service for quickly producing corpora for specialist areas, in any of a range of languages, from the web. The underlying BootCaT tools have already been extensively used: here, we present a version which is easy for non-technical people to use as all they need do is fill in a web form. The corpus, once produced, can be either downloaded or loaded into the Sketch Engine, a corpus query tool, for further exploration. Reference corpora are used to identify the key terms in the specialist domain.
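As an illustration of the reference-corpus step, key terms can be ranked by how much more frequent they are in the specialist corpus than in a reference corpus. The smoothed frequency ratio below is an assumption for illustration only, not BootCaT's actual scoring formula:

```python
from collections import Counter

def key_terms(domain_tokens, reference_tokens, top_n=3, smoothing=1.0):
    """Rank words by how much more frequent they are in the domain
    corpus than in the reference corpus (simple keyness ratio)."""
    dom, ref = Counter(domain_tokens), Counter(reference_tokens)
    dom_total, ref_total = len(domain_tokens), len(reference_tokens)
    def score(w):
        # Relative frequency in each corpus, smoothed to avoid division by zero.
        p_dom = (dom[w] + smoothing) / (dom_total + smoothing)
        p_ref = (ref[w] + smoothing) / (ref_total + smoothing)
        return p_dom / p_ref
    return sorted(dom, key=score, reverse=True)[:top_n]

domain = "corpus query corpus sketch corpus engine query".split()
reference = "the a corpus of general language text the a of".split()
print(key_terms(domain, reference))  # domain-specific words rank first
```

Real keyness measures (e.g. log-likelihood) are more robust, but the ratio already shows why common function words drop to the bottom of the ranking.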
- 12. 10. 2006
- P. Hanks: The English Language: an International Medium of Communication
What is the most widely spoken language in the world? Not English, but Chinese. Nevertheless, English has a claim to be regarded as the world's most important language for purposes of international communication. Why? My talk begins with a comparison of the world's major languages today (Chinese, Spanish, Arabic, Hindi/Urdu, French, German, etc.) and their different roles in the world community. I then go on to explain the origins, history, and development of English. How did a minor West Germanic dialect, spoken on a rain-sodden offshore island, with a comparatively short history, manage to become a world language? What did Old English sound like, and what were the factors that changed it into modern English? What is the basic sound pattern of English that every speaker of English, especially foreign speakers, ought to know about? How is the vocabulary of English changing worldwide, and how does English influence other languages? What does this tell us about the world we live in? My argument is that the English language has long since ceased to be the property of any one nation or social group. English is an inexhaustible, robust resource. As such it must be clearly separated from the history of colonization or the political hegemony of the world's superpower (whose leaders happen to use a form of English). English is a resource for all members of the world community, and standards of correct English are to be sought and defined as international standards of communicative clarity, not in the preferences or history of any one nation or dialect.
- 19. 10. 2006
- B. Zimmerová: A formal specification language for modelling component interactions.
Current software engineering calls for cost-effective development techniques that guarantee dependable and flexible software products. One such technique is component-based development, which is based on composing software from autonomous components. In this context, a new verification issue arises. It concerns the correctness of interaction among components, which are usually delivered by different developers. My PhD topic focuses on this issue. In the talk, I will present a formal specification language that we have designed for modelling component interactions, and will briefly discuss concrete problems that I aim to address in my dissertation, including my preliminary results.
- 26. 10. 2006
- J. Šprojcar: Elections, Modularity, and Anonymous Channels
We concentrate on electronic elections as a cryptographic task. In particular, we are interested in their properties and in protocols for their implementation. We argue that this task is very complicated and propose that a modular approach can be used to simplify it. The modular approach consists of taking small protocols and building a bigger one on top of them. The question is whether the election task can be decomposed into small (meaningful) primitives. We present one of these primitives, the anonymous channel, and show a way of using this primitive in the design of election schemes.
- 2. 11. 2006
- P. Moravec: Distributed State Space Reductions
Model checking has become a popular approach to the formal verification of concurrent systems. However, the so-called state space explosion problem limits the usage of the approach. Several methods coping with state space explosion have been proposed. Since those methods are often based on orthogonal ideas, there is a natural question of how to combine them. We present combinations of two such approaches: distributed verification and reductions of state spaces (namely partial order reduction, symmetry reduction and tau-confluence reduction).
- 9. 11. 2006
- P. Lidman: Using Machine Learning in Human Risk Assessment
Human Risk Assessment is a rapidly developing biomedical field which aims to efficiently repair health and environmental damage and, where possible, prevent it. Modern diagnostic tools and approaches generate a great deal of useful data which are difficult or even impossible to analyse by means of statistics, let alone manually, using expertise alone. This talk introduces two such datasets: the National Cancer Registry, and microarrays, a gene expression data extraction technique. The talk will proceed with a discussion of the prediction and knowledge discovery potential of machine learning methods on such datasets and describe the challenges that lie ahead.
- 16. 11. 2006
- M. Hejč: Measurement and Improvement of Data Quality
The use of data is one of the typical tasks of modern software. Data originate from various sources and are burdened by various types of “noise”. I will present the current state of the terminology and methodologies for dealing with these shortcomings. Since there is no common approach, I will identify the direction of my future research in this area: finding a common methodology. I will also present the results of several case studies.
- 23. 11. 2006
- H. Mlnařík: LanQ - a Quantum Programming Language
Interest in quantum information processing was stimulated by the development of quantum algorithms and protocols that are capable of solving certain problems faster or more securely than their classical counterparts. This led to the creation of quantum programming languages and quantum process algebras. Combining features of these programming languages and process algebras led to the development of a new programming language, LanQ. LanQ is an imperative quantum programming language that offers tools for new process creation and interprocess communication. It allows the programmer to work with both classical and quantum data. In the talk, we introduce the language.
- 30. 11. 2006
- T. Masopust: Self-Regulating Finite Automata
The presentation introduces and discusses self-regulating finite automata, defined as finite automata that regulate the use of their rules by a sequence of rules applied during previous moves. Special attention is paid to turns, defined as moves during which a self-regulating finite automaton starts a new self-regulating sequence of moves. Based on the number of turns, two infinite hierarchies of language families, resulting from two variants of these automata, are established.
- 7. 12. 2006
- M. Kasík: Deconvolution of images acquired in Fluorescence Microscopy with space-variant Point Spread Function
Images acquired by fluorescence microscopy are blurred by the point spread function. Before processing such data, we need to remove the blur. The process that performs this sharpening of the image is called deconvolution. To perform this class of algorithms, we need an appropriate point spread function (PSF). In the talk we will present the results of applying deconvolution with an incorrect PSF. We will also introduce a solution to this problem.
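To see why a correct PSF matters, here is a toy 1-D, noise-free, shift-invariant sketch (the talk concerns 3-D microscopy images with a space-variant PSF, so this is only an illustration): convolution with the PSF models the blur, and exact deconvolution by polynomial long division recovers the signal only when the PSF is right.

```python
def convolve(signal, psf):
    """Discrete 1-D convolution: the blur model in this toy setting."""
    out = [0.0] * (len(signal) + len(psf) - 1)
    for i, s in enumerate(signal):
        for j, p in enumerate(psf):
            out[i + j] += s * p
    return out

def deconvolve(blurred, psf):
    """Invert the convolution by polynomial long division.
    Exact when the PSF is correct and the data are noise-free."""
    n = len(blurred) - len(psf) + 1
    rest = list(blurred)
    signal = []
    for i in range(n):
        coeff = rest[i] / psf[0]
        signal.append(coeff)
        for j, p in enumerate(psf):
            rest[i + j] -= coeff * p
    return signal

signal = [0.0, 1.0, 3.0, 2.0, 0.0]
psf = [0.25, 0.5, 0.25]                  # the "correct" PSF
blurred = convolve(signal, psf)
print(deconvolve(blurred, psf))          # → [0.0, 1.0, 3.0, 2.0, 0.0]
print(deconvolve(blurred, [0.5, 0.5]))   # wrong PSF: a distorted result
```

Practical deconvolution methods (e.g. Richardson-Lucy) work differently because of noise, but the dependence on the PSF estimate is the same.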
- 14. 12. 2006
- M. Grac: Machine Translation of Close Languages
Machine translation is a part of computational linguistics that investigates the use of software to translate text or speech from one language to another. This is one of the most complex tasks in natural language processing. Our research concentrates on translation between close languages: Czech and Slovak. In the presentation we will cover the current state of the art.
- 22. 2. 2007
- Opening seminar of the spring semester
- Information about the concept of the seminar in the spring semester. Agreement on the seminar programme. Discussion.
- 1. 3. 2007
- D. Klusáček: Grid Scheduling Simulator
Effective job scheduling in the context of Grid computing is a complex problem, often solved by simplified techniques. This presentation concentrates on the design of a system intended for the study of advanced scheduling techniques for planning various types of jobs in a Grid environment. The solution is able to deal with common problems of job scheduling in Grids, such as the heterogeneity of jobs and resources, and with dynamic runtime changes such as the arrival of new jobs. The GridSim simulation toolkit was extended to provide a simulation environment that supports the simulation of various Grid scheduling problems. We implemented an experimental centralized Grid scheduler which uses local-search-based algorithms and dispatching rules for schedule generation. Interesting experimental results comparing the quality of optimization and time performance will be presented.
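As an illustration of the kind of dispatching rule such schedulers use, the minimum-completion-time rule assigns each arriving job to the machine that would finish it earliest. The job lengths and machine speeds below are made up, and this is not the talk's actual scheduler:

```python
def mct_schedule(jobs, machine_speeds):
    """Dispatch each job (given as a length) to the machine that would
    finish it earliest; runtime on machine m is length / machine_speeds[m]."""
    finish = [0.0] * len(machine_speeds)   # current finish time of each machine
    assignment = []
    for length in jobs:
        # Completion time of this job on each machine if dispatched there.
        completions = [finish[m] + length / machine_speeds[m]
                       for m in range(len(machine_speeds))]
        best = min(range(len(machine_speeds)), key=completions.__getitem__)
        finish[best] = completions[best]
        assignment.append(best)
    return assignment, max(finish)

jobs = [4, 2, 8, 3]        # job lengths (arbitrary units)
speeds = [1.0, 2.0]        # the second machine is twice as fast
plan, makespan = mct_schedule(jobs, speeds)
print(plan, makespan)      # → [1, 0, 1, 0] 6.0
```

Dispatching rules like this are greedy and fast; the local-search algorithms mentioned in the abstract would start from such a schedule and try to improve it.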
- 8. 3. 2007
- P. Drášil: Achieving reusability in e-learning
Reusability is one of the most desirable features of modern teaching/learning approaches, particularly e-learning. Historically, this issue was tackled by the standardization of metadata and packaging formats for electronic learning materials. However, it has turned out that this is not enough and that the underlying pedagogy is a very important aspect of every learning material. Instructional design techniques allowing teachers to describe, and consequently reuse, whole teaching/learning scenarios were therefore developed, and they are currently on their way into common practice. The talk will provide an overview of instructional design techniques and their possible benefits.
- 15. 3. 2007
- P. Beneš: Computation of Tunnels in Protein Molecules based on Computational Geometry
Long-term research into the biochemical characteristics of protein molecules has led to the discovery that protein reactivity is closely related to the presence of tunnels leading from the protein surface to a biochemically relevant cavity inside the protein molecule. Our research concentrates on the computation of these tunnels in static protein molecules and on the analysis of the behaviour of tunnels in sequences of molecule snapshots over time. The methods we propose are based on computational geometry, the Voronoi diagram and the Delaunay triangulation in particular.
- 22. 3. 2007
- B. Kozlíková: Visualization of Protein Molecules and Tunnels in Three-Dimensional Space
Proteins are among the most important molecules in living organisms, so the analysis of these molecules is crucial in the process of designing new drugs. The next important step after the analysis is the visualization of the results, which can provide a much more intuitive and deeper view of the complex structure. We are trying to enable the user to explore the structure of the molecule and its tunnels using various techniques which suppress the unimportant parts of the molecule and stress the important ones. The molecule is not a static system. It is influenced by its surroundings, so we can describe the behaviour of the molecule over time. Our research in this area is focused on devising techniques which can compute and show only the substantial movements of atoms. In the field of tunnel visualization, we have to come up with novel approaches which will enable us to explore the path from the outside of the molecule into its active site, where the chemical reactions proceed.
- 29. 3. 2007
- P. Šimeček: LTL model checking with I/O Efficient Accepting Cycle Detection
We show how to adapt OWCTY, an existing non-DFS-based algorithm for accepting cycle detection, to the I/O efficient setting, and we compare the I/O efficiency and practical performance of the adapted algorithm to the existing I/O efficient LTL model checking approach of Edelkamp et al. We show that while the new algorithm exhibits similar I/O complexity with respect to the size of the graph, it avoids the quadratic increase in the size of the graph incurred by the approach of Edelkamp et al. Therefore, the absolute numbers of I/O operations are significantly smaller and the algorithm exhibits better practical performance.
- 5. 4. 2007
- O. Oladimeji: Survey of Digital Systems Test and Simulation Techniques
The digital system has brought a major revolution to virtually all facets of human endeavour. Digital electronic circuits are shrinking in physical size while their capabilities and speed of operation increase. To date, the miniaturised nature of digital system circuits has placed computing power in the hands of many users, with applications ranging from home entertainment systems through hand-held systems (otherwise known as embedded systems) to large computer systems. However, dependence on these systems calls for the production of systems which must be highly reliable. One major problem which besets the digital system revolution is the problem of test and simulation. The increase in size and complexity of circuits placed on a chip, with little or no increase in the number of input and output pins, effectively creates a bottleneck. Tests must detect not only failures in individual units but also failures caused by a defective manufacturing process. In this presentation, we examine the traditional approach to generating stimuli (input vectors) and their application to combinational and sequential circuits. Various test algorithms such as the D-algorithm, path sensitization, the Boolean difference algorithm and the Automatic Test Pattern Generation (ATPG) algorithm are also examined. The overall evaluation reveals the associated problems, and feasible solutions are presented.
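As a baseline for the structural algorithms mentioned (D-algorithm, path sensitization), test generation for a single stuck-at fault can be sketched by brute force: search for an input vector on which the fault-free and faulty circuits differ. The toy circuit and fault below are illustrative only:

```python
from itertools import product

def find_test_vector(good, faulty, n_inputs):
    """Brute-force ATPG: find an input vector on which the fault-free and
    faulty circuits disagree, i.e. a vector that detects the fault."""
    for vec in product([0, 1], repeat=n_inputs):
        if good(*vec) != faulty(*vec):
            return vec
    return None  # no such vector: the fault is undetectable (redundant)

# Toy circuit y = (a AND b) OR c, with a stuck-at-0 fault on input a.
good   = lambda a, b, c: (a & b) | c
faulty = lambda a, b, c: (0 & b) | c
print(find_test_vector(good, faulty, 3))  # → (1, 1, 0)
```

The structural algorithms in the talk avoid this exponential enumeration by propagating fault effects through the circuit's gate-level netlist instead.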
- 12. 4. 2007
- J. Chaloupka: New Distributed Algorithm for Decomposition of Graphs into SCCs
Decomposing a directed graph into its strongly connected components is one of the basic graph problems. It has many applications, among others in the analysis of computer systems. It can be solved in linear time. However, graphs modelling complex computer systems tend to be very big, which makes it hard to handle them on a single machine. One way to tackle this problem is to distribute the graph across a cluster of workstations. Unfortunately, the linear sequential algorithm is unusable in such a setting. Several distributed algorithms for SCC decomposition have been proposed. We present a new distributed algorithm and show its performance in our experiments.
- P. Klika: Software assistant supporting medical processes
This work was motivated by issues in current IT systems in medicine, which include especially: a lack of appropriate data caused by an insufficient integration of their sources; the huge volume of data presented to physicians without any relevance in a given context; or information of doubtful quality with no indication of that quality. One of the big challenges in medical systems is the support of therapeutic practice guidelines. This theme covers several levels of support, from informing doctors of the existence of appropriate guidelines and their steps, through guiding them through the chosen process, up to the validation of the guidelines' correctness itself. Systems supporting the practice guidelines have to satisfy a number of prerequisites, which include context modelling, relevance determination and complex similarity searching. In this presentation these requirements and the principles of possible solutions will be discussed.
- 19. 4. 2007
- I. Peterlík: An Algorithm of State-Space Precomputation Allowing Non-linear Haptic Deformation Modelling
An interesting area of virtual reality research is haptic interaction with deformable objects. The user is equipped with a haptic device with force feedback. Using a probe (virtual object) during the interaction, she forces the deformable body to change its shape. The models are used in the implementation of surgical simulators which allow users to perform virtual operations (applicable in surgical training and the planning of complex operations). Realistic modelling based on a physical formulation is computationally expensive. Usually, large systems of non-linear equations must be solved in each step. On the other hand, realistic haptic behaviour requires a high refresh rate (over 1 kHz), and therefore real-time computation is not feasible. In my talk I will describe a new algorithm which allows haptic interaction with computationally demanding models. The algorithm is based on distributed state-space precomputation and interpolation of the precomputed data during the interaction. Further, I will give some preliminary results concerning the accuracy of the algorithm, and I will also describe some modifications allowing more complex operations such as cutting and tearing of the tissue.
- 26. 4. 2007
- T. Rebok: DiProNN: VM-based Distributed Programmable Network Node Architecture
The active network approach allows an individual user to inject customized programs into nodes in the network, usually called programmable/active nodes, and thus process data in the network as it passes through. As the speeds of network links increase, and subsequently the applications' demands for network bandwidth increase as well, a single active node is unable to process such high-bandwidth user data in real time, since the processing may be fairly complex. In my talk I will present the architecture of the DiProNN node, a VM-based Distributed Programmable Network Node, that improves the scalability of such an active system with respect to the number of active programs simultaneously running on the node and with respect to the bandwidth of each passing stream processed. Since the node is primarily meant to perform stream processing, and to make the programming of streaming applications for the DiProNN node easier, I will also present a suitable modular programming model which takes advantage of DiProNN's virtualization and makes its programming more comfortable.
- J. Šprojcar: Anonymous Channels
The talk will focus on cryptographic primitives called anonymous channels. These are cryptographic protocols that deal with the anonymity of parties, e.g. a channel where the sender of a message is anonymous (within a set of possible senders). We will present several anonymous channels and their applications as examples. A method called dining cryptographers nets, introduced by D. Chaum in 1988, which implements an anonymous broadcast channel, is also briefly presented. We end our talk with one of our contributions: a formalization of anonymous channels.
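A minimal sketch of one round of Chaum's dining cryptographers net, for intuition (parties arranged in a ring, one shared secret coin per adjacent pair):

```python
import random

def dc_net_round(n_parties, sender, message_bit):
    """One round of a dining cryptographers net: each adjacent pair of
    parties shares a secret coin; every party announces the XOR of its
    two coins, and the sender additionally XORs in the message bit."""
    # Secret coin shared between party i and party (i+1) mod n.
    coins = [random.randint(0, 1) for _ in range(n_parties)]
    announcements = []
    for i in range(n_parties):
        a = coins[i] ^ coins[(i - 1) % n_parties]
        if i == sender:
            a ^= message_bit
        announcements.append(a)
    # Each coin appears in exactly two announcements, so XOR-ing all
    # announcements cancels every coin and reveals only the message bit.
    result = 0
    for a in announcements:
        result ^= a
    return result

print(dc_net_round(3, sender=1, message_bit=1))  # → 1, sender stays hidden
```

Observers learn the broadcast bit but, without the shared coins, cannot tell which party injected it; that is the anonymity property the talk formalizes.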
- 3. 5. 2007
- V. Němčík: Anaphora Resolution
At present, anaphora resolution is one of the biggest challenges in the field of natural language understanding. Although anaphora plays an important role in human communication and is in its essence an interdisciplinary issue, it is not widely familiar to computer scientists. Therefore, this talk provides a brief insight into the relevant linguistic background, and an overview of various types of anaphora and their computational aspects. Finally, it sketches various research aims on the way towards a badly-needed anaphora resolution system for Czech.
- L. Boháč: Programmable quantum processors - classification and equivalence
Quantum information processing has recently become a very important and challenging branch of computer science. The laws of the microscopic quantum world can be exploited to speed up computations or to implement more secure cryptographic schemes. A programmable quantum processor could be the heart of a quantum computer. Nowadays, experimenters set up a special device for each task they want to perform. Such a device is controlled externally using classical parameters. In contrast, a programmable quantum processor is a fixed device that can be programmed quantumly: it is programmable by the states of a quantum system. There is no universal processor for the basic deterministic model. New types of processors, probabilistic and approximative, were developed, for which universal processors can be designed. Basic concepts and properties, especially the equivalence, of different types of programmable quantum processors will be presented.
- 10. 5. 2007
- J. Sedmidubský: Metric Social Networks
The area of similarity searching is a very hot topic for both research and commercial applications. Current data processing applications use data with considerably less structure and much less precision than traditional database systems. Examples are multimedia data such as images or videos that offer query-by-example search and cannot be meaningfully searched by precise database queries. One approach to similarity search emerges from the notion of a social network. A social network refers to a social structure of people related to each other through a common relation or interest. Our approach places the peers of a distributed access structure in the role of the people in the social network and creates relationships among them according to the similarity of the peers' data. The query processing then represents the search for a community of people, i.e. peers related by a common interest: similar data.
- I. Fialík: Pseudo-Telepathy Games
Quantum information processing is at the crossroads of physics, mathematics and computer science. It investigates what we can and cannot do with quantum information, going beyond the abilities of classical information processing. Communication complexity is an area of classical computer science which studies how much communication is necessary to solve distributed computational problems. Quantum information processing can be used to reduce the amount of communication required to solve some distributed problems. We speak of pseudo-telepathy when it serves to completely eliminate the need for communication. After a brief overview of the principles of quantum mechanics, we introduce the model for pseudo-telepathy games. As an example of a pseudo-telepathy game, we describe the magic square game.
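The pseudo-telepathy in the magic square game rests on a classical impossibility that is easy to verify exhaustively: no ±1 filling of a 3x3 grid has all row products +1 and all column products -1 (the encoding below is one common formulation of the game's winning condition):

```python
from itertools import product

def classical_magic_square_exists():
    """Search all 2^9 fillings of a 3x3 grid with +1/-1 for one where
    every row has product +1 and every column has product -1, i.e. the
    table a perfect classical strategy for the magic square game would
    need. None exists: the product of all nine entries would have to be
    both +1 (by rows) and -1 (by columns)."""
    for cells in product([1, -1], repeat=9):
        rows_ok = all(cells[3*r] * cells[3*r+1] * cells[3*r+2] == 1
                      for r in range(3))
        cols_ok = all(cells[c] * cells[c+3] * cells[c+6] == -1
                      for c in range(3))
        if rows_ok and cols_ok:
            return True
    return False

print(classical_magic_square_exists())  # → False
```

Since no classical shared table wins every round, the entangled strategy's perfect win rate cannot be explained without quantum resources: hence "pseudo-telepathy".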
- 17. 5. 2007
- Poster Session