UCLIC Seminar: End user engagement with data

Speaker: Paul Marshall, University of Bristol
UCL Contact: Aneesha Singh (Visitors from outside UCL please email in advance).
Date/Time: 17 Jan 18, 15:00 - 16:00
Venue: Room 405, 66-72 Gower Street

Abstract

Computer science is going through something of a data revolution, with new mathematics and algorithms being developed to make sense of the global explosion of data. However, most of this work has focused on the insights that large organisations - governments, IT companies or the NHS - can derive from huge data repositories. Even when the focus has been on end users, as in consumer devices such as activity trackers, technologies often embody paternalistic and inflexible views about which behaviours should be changed. In this talk, I will describe a number of case studies in which my students and postdocs have tried to identify the perspectives of individuals and groups who are end users of data technologies. I hope to discuss with the audience what the future of end-user data technologies might be.

InfoSec Seminar: Adversarial Machine Learning

Speaker: Jamie Hayes, UCL-CS
UCL Contact: Vasilios Mavroudis (Visitors from outside UCL please email in advance).
Date/Time: 18 Jan 18, 16:00 - 17:00
Venue: New Quad Pop Up 102

Abstract

This talk gives an overview of two recent papers on use cases of adversarial machine learning. The first half presents the first method for information hiding using machine learning that is competitive with more established techniques. The second half presents a case study of privacy leakage in generative models. We show that an attacker with no knowledge of the training algorithm, model type or architecture, model parameters, or data distribution can learn which data items were used to train a generative model - potentially privacy-sensitive information.
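
The black-box setting above - an attacker who only observes samples drawn from the target model - can be illustrated with a toy baseline. The sketch below is not the method from the talk; it is a minimal, hypothetical nearest-neighbour membership scorer in Python (the function membership_scores and the synthetic data are assumptions made purely for illustration), in which candidate records that lie unusually close to generated samples are flagged as likely training members.

# Illustrative sketch only (not the attack presented in the talk): a simple
# black-box membership-inference baseline against a generative model.
import numpy as np

def membership_scores(candidates: np.ndarray, generated: np.ndarray) -> np.ndarray:
    """Return a score per candidate record: higher = more likely a training member.

    candidates: (n, d) array of records the attacker wants to test.
    generated:  (m, d) array of samples drawn from the target generative model.
    """
    # Pairwise Euclidean distances between every candidate and every generated sample.
    diffs = candidates[:, None, :] - generated[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    # Nearest-neighbour distance; negate so that "closer to the model's samples"
    # yields a higher membership score.
    return -dists.min(axis=1)

if __name__ == "__main__":
    # Synthetic toy data: "members" sit close to the model's samples, "non-members" do not.
    rng = np.random.default_rng(0)
    generated = rng.normal(size=(1000, 8))                              # stand-in for model samples
    members = generated[:50] + rng.normal(scale=0.05, size=(50, 8))     # hypothetical training records
    non_members = rng.normal(loc=0.5, size=(50, 8))                     # hypothetical unseen records
    scores = membership_scores(np.vstack([members, non_members]), generated)
    print("mean score, members:    ", scores[:50].mean())
    print("mean score, non-members:", scores[50:].mean())

Real attacks of the kind discussed in the talk are considerably more sophisticated, but they share this general shape: compute a per-record score from the model's observable behaviour, then threshold it to decide membership.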

Jamie Hayes

Jamie Hayes is a PhD student at University College London, UK. His research focuses on the intersection of privacy, security and machine learning.

Grand Challenges Seminar: Life Outside the Real World

Speakers: Simon Julier, Niloy Mitra, David Swapp, Anthony Steed, UCL-CS
UCL Contact: Sarah Turnbull (Visitors from outside UCL please email in advance).
Date/Time: 26 Jan 18, 12:00 - 14:00
Venue: Torrington (1-19) G13

Abstract

Interface technologies are changing very rapidly. Augmented reality and virtual reality displays are becoming feasible at consumer prices. In parallel, sensing systems and computation are becoming ubiquitous. The real world is being digitised, and at the same time parallel virtual worlds are being built. This set of talks will explore the computer science challenges of these new technologies and how they can foster new interdisciplinary areas.

Simon Julier: Mixed Reality Systems

Mixed Reality (MR) uses techniques that change the way we perceive the world, allowing us to alter it in arbitrary ways. We could fill it with people and things that never were or no longer exist (history, gaming, narratives and storytelling), reveal hidden facts and information (data visualisation, instructions, monitoring), and make any part of the world a space for digital interaction (creativity, collaboration). However, this potential is limited by current systems, which rely on head-worn or hand-held displays, high-precision tracking systems and comprehensive geometric models of the environment, and which present users with novel interfaces they have had little opportunity to experience and master. Future MR systems must overcome these barriers: new types of displays - or the elimination of displays altogether - need to be investigated, tracking systems that leverage an understanding of the environment must be used, and new methods for intuitive interaction must be developed.

Niloy Mitra: Capturing Interactions in Crowded Environments

Digitising the physical world is critical for many emerging fields such as virtual and augmented reality, smart home systems, robotics and assisted living. However, capturing interactions and object layouts from a single view is challenging due to heavy inter-object and human-object occlusions. We will describe our ongoing efforts to solve this grand challenge. If solved, the captured scenes and interactions will allow us to redesign our working and living spaces to take into account our personal preferences and habits.

David Swapp: Understanding and Using Perception

The development of successful VR systems has always been contingent on understanding human perceptual mechanisms and then exploiting this knowledge to create systems of artificial stimuli that elicit realistic responses from people. Conversely, we can further our understanding of human behaviour at a variety of levels (psychophysical, cognitive, sociological) by observing human responses to controlled virtual environments. A selection of research projects in the CAVE lab will illustrate these ideas.

Anthony Steed: Telepresence and Challenges

Telepresence technologies allow the user to experience another place in real time. While most of us use video conferencing, we know that it can be awkward to use. In the lab we have various robots that allow us to experience remote places through audio, visual and tactile channels, as if we were physically located there. I will talk about some of the upcoming challenges in this area, in particular computation speed and latency.

We will finish with a discussion of the computer science research challenges of this class of user interface.