"Through algorithms and artificial intelligence (AI), objects and digital services now demonstrate new skills they did not have before, right up to replacing human activity through pre-programming or by making their own decisions. As part of the internet of things, AI applications are already widely used today, for example in language processing, image recognition and the tracking and processin…
This book is about the longevity of digital surrogates of historical photographs. The preservation of digital photos is considered in the context of long-term access to digital objects in general. There is a general view among archivists, librarians and museum professionals that analogue originals and their digital counterparts are closely related. The features of a digital surrogate, such as a…
Covering the authors' own state-of-the-art research results, this book presents a rigorous, modern account of the mathematical methods and tools required for the semantic analysis of logic programs. It significantly extends the tools and methods from traditional order theory to include nonconventional methods from mathematical analysis that depend on topology, domain theory, generalized distanc…
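The abstract leaves the classical baseline implicit. As a minimal sketch of that baseline (the helper names and the propositional example program are invented here, and propositional atoms stand in for the general first-order Herbrand setting), the traditional order-theoretic semantics assigns a definite logic program its least model as the least fixed point of the immediate consequence operator T_P, reached by Kleene iteration from the empty interpretation:

# Minimal sketch: least-model semantics of a definite (negation-free)
# propositional logic program via Kleene iteration of T_P.
# A program is a list of (head, body) pairs; an interpretation is a set of atoms.

def t_p(program, interpretation):
    """One application of T_P: derive every head whose body is already satisfied."""
    return frozenset(head for head, body in program
                     if all(atom in interpretation for atom in body))

def least_model(program):
    """Iterate T_P from the empty interpretation until a fixed point is reached."""
    current = frozenset()
    while True:
        nxt = t_p(program, current)
        if nxt == current:
            return current
        current = nxt

# Invented example program:  p.   q :- p.   r :- p, q.   s :- t.
program = [("p", []), ("q", ["p"]), ("r", ["p", "q"]), ("s", ["t"])]
print(sorted(least_model(program)))   # ['p', 'q', 'r']; s is not derivable

The nonconventional analytic methods the blurb mentions (topology, domain theory, generalized distances) typically come into play where this simple monotone fixed-point picture no longer suffices, for example for programs with negation.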
"The Ars Edendi Lectures have been organized by the research programme at Stockholm University funded by Riksbankens Jubileumsfond during the years 2008-2015, with a focus on editorial methods for dynamic textual traditions of medieval Greek and Latin texts. This fourth volume gathers contributions both on the fundamentals of editing, as in Glenn Most ‘What is a critical edition?’, and look…
Atari’s 1981 arcade hit Tempest was a “tube shooter” built around glowing, vector-based geometric shapes. Among its many important contributions to both game and cultural history, Tempest was one of the first commercial titles to allow players to choose the game’s initial play difficulty (a system Atari dubbed “SkillStep”), a feature that has since become standard for games of all t…
In the second half of the 1990s Christian Mauduit and András Sárközy [86] introduced a new quantitative theory of pseudorandomness of binary sequences. Since then numerous papers have been written on this subject and the original theory has been generalized in several directions. Here I give a survey of some of the most important results involving the new quantitative pseudorandom measures o…
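For readers unfamiliar with the measures, the standard definitions (as usually stated in the literature; the notation here may differ slightly from [86]) read as follows. For a binary sequence E_N = (e_1, …, e_N) ∈ {−1, +1}^N, the well-distribution measure and the correlation measure of order k are

\[
W(E_N)=\max_{a,b,t}\Biggl|\sum_{j=0}^{t-1}e_{a+jb}\Biggr|,
\qquad
C_k(E_N)=\max_{M,D}\Biggl|\sum_{n=1}^{M}e_{n+d_1}e_{n+d_2}\cdots e_{n+d_k}\Biggr|,
\]

where the first maximum is taken over a, b, t with 1 ≤ a ≤ a+(t−1)b ≤ N, and the second over M and D = (d_1 < d_2 < ⋯ < d_k) with M + d_k ≤ N; a sequence is regarded as having good pseudorandom properties when both measures are small relative to N.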
VOICECONET: A Collaborative Framework for Speech-Based Computer Accessibility with a Case Study for Brazilian Portuguese
Toward Computational Processing of Less Resourced Languages: Primarily Experiments for Moroccan Amazigh Language
Modeling Human-Computer Interaction in Smart Spaces: Existing and Emerging Techniques
Large parts of our culture are characterised by a blurring of corporeal and virtual realities. The computer industry has opened up all sorts of hybrid spaces and has itself, in turn, become a space for countless fantasies and expectations. What are the principles of cultural order expressed by this desire commonly termed cyberspace? In offering novel narratives of modern architecture, the author e…